The effect of the 56MHz modulation depth changes is clearly visible in the MICH/SRCL error signal noise levels, but one needs to choose appropriate FFT lengths and data durations to resolve small changes in wideband noise. The figures in the logbook entry have factor ~3 statistical fluctuations, so one cannot conclude much from them.
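As an illustrative sketch (not the actual logbook analysis, with assumed sampling rate and simulated white noise), the relative scatter of a Welch PSD estimate shrinks roughly as 1/sqrt(N_averages), so with too little data a factor ~1.4 level change is buried in the statistical fluctuations:

```python
import numpy as np
from scipy.signal import welch

# Hypothetical example: white noise standing in for an error signal.
rng = np.random.default_rng(0)
fs = 1024                          # assumed sampling rate (Hz)
x = rng.standard_normal(120 * fs)  # two minutes of simulated noise

scatter = {}
for seconds in (2, 60):            # short vs long averaging time
    # 1 s long FFT segments, as in Figure 1
    f, psd = welch(x[: seconds * fs], fs=fs, nperseg=fs)
    # for white noise, the bin-to-bin scatter reflects the per-bin
    # statistical uncertainty of the estimate
    scatter[seconds] = np.std(psd[1:-1]) / np.mean(psd[1:-1])
    print(f"{seconds:2d} s of data: relative PSD scatter {scatter[seconds]:.2f}")
```

With only a couple of seconds of data the per-bin scatter is of order unity, while one minute of averaging brings it down to the ~10% level needed to see a factor ~1.4 step.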
Figure 1 shows spectra with 1s long FFTs and one minute of data in each case: the 12dBm case in purple, the 15dBm case in red and the 9dBm case in blue. For SRCL, the three curves are separated by a factor ~1.4 each, which is what one would expect from 3dB steps in modulation. For MICH, the 9dBm case is clearly worse (by about a factor 1.3), while the 12dBm and 15dBm cases are the same, so it seems that the MICH sensing noise becomes limited by something else above 12dBm.
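For reference, the expected factor ~1.4 comes from converting a 3dB power step to an amplitude ratio (a quick arithmetic check, not part of the original analysis):

```python
# A 3 dB step in modulation power corresponds to a factor
# 10**(3/20) in an amplitude spectrum, i.e. ~1.41 -- consistent
# with the ~1.4 separation between the SRCL curves.
step_db = 3
amplitude_ratio = 10 ** (step_db / 20)
print(f"{amplitude_ratio:.2f}")  # ~1.41
```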
Figure 2 looks at later data, when the modulation depth was reduced.
Figure 3 compares the case with 15dBm on both 6MHz and 56MHz (purple) and 12dBm on both (blue), using 5min of data for each time. MICH and SRCL both show the expected factor ~1.4 lower sensing noise. h(t) also improves slightly just below 20Hz and between 40Hz and 45Hz, which are the two frequency bands where there was coherence between DARM and SRCL.
Figure 4 shows the coherence between SRCL and DARM / h(t) for the same times, and indeed the coherence with SRCL is smaller when the modulation depth is larger.
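The coherence behavior can be sketched with made-up signals (all names and numbers below are hypothetical, not the real channels or couplings): when less SRCL sensing noise couples into DARM, the SRCL/DARM coherence drops.

```python
import numpy as np
from scipy.signal import coherence

# Hypothetical model: DARM = independent noise + coupling * SRCL noise.
rng = np.random.default_rng(1)
fs, n = 1024, 300 * 1024              # assumed rate, 5 min of data
srcl = rng.standard_normal(n)         # stand-in for SRCL sensing noise
darm_noise = rng.standard_normal(n)   # independent DARM noise

coh_mean = {}
for coupling in (1.0, 0.5):           # larger modulation -> less sensing
    darm = darm_noise + coupling * srcl     # noise coupled into DARM
    f, coh = coherence(srcl, darm, fs=fs, nperseg=4 * fs)
    coh_mean[coupling] = coh.mean()
    print(f"coupling {coupling}: mean coherence {coh_mean[coupling]:.2f}")
```

In this toy model the coherence is coupling**2 / (1 + coupling**2), so halving the coupled noise takes it from ~0.5 to ~0.2, the same qualitative trend as in Figure 4.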