Gravitational waves (GWs) are a prediction of Einstein’s general theory of relativity, which extends the theory of gravitation by renouncing the instantaneous action at a distance that was shocking to Isaac Newton himself and had already become unacceptable after the special theory of relativity. The gravitational interaction is now carried by a wave messenger propagating at the speed of light: a gravitational wave. However, the efficiency of the conversion of any kind of energy into gravitational radiation is extremely weak, so that emitting and/or detecting such waves was for decades considered well outside experimental possibilities. The situation changed with the technological expansion of the 1960s. Joseph Weber was the first to propose an experiment aiming to detect GWs of astrophysical origin. However, the initial Weber experiment was still too simple to detect anything of astrophysical interest. This motivated theorists to work out more accurate estimates of the GW signals produced by astrophysical cataclysms such as supernovae, binary coalescences, fast-spinning neutron stars, etc. Readers interested in this part of the history of the field can consult the review by Thorne. It was soon noted that optical interferometers of the Michelson type had exactly the right topology with respect to the gravitational wave polarization, had a large potential sensitivity, and, being intrinsically wide-band transducers, were able to produce electrical signals analogous to the gravitational waveforms. After several prototypes of various sizes were built (U.S.A., U.K., Germany), and following the pioneering work of Ronald Drever of Caltech, Rainer Weiss of MIT was the first to study the technological issues specific to large-size interferometric antennas and to determine the general principles of ground-based antennas. This was the seed of the LIGO project in the U.S.A., of the British-German GEO project, unfortunately aborted, and of the French-Italian Virgo project.
Despite these efforts, and even after the construction of kilometer-size antennas, GWs have yet to be detected because the sensitivity of the present antennas (LIGO, Virgo) is still too low. It was foreseen from the beginning that technological breakthroughs would allow the sensitivity to be enhanced in the near future. This is the present situation, and the R&D of “advanced detectors” has already begun. One aspect of these advanced detectors is an improved use of light for reading out the tiny apparent variations of the distances between test masses.
Ground-based interferometers for GW detection are made of silica pieces (the substrates of the mirrors) hanging in a vacuum. Detection of GWs requires the continuous measurement of the flight time of photons between two mirrors facing each other, or, in other words, of the phase of light reflected off a Fabry–Pérot cavity. A passing GW is expected to have a differential effect on the phases of two orthogonal cavities, which is why the Michelson configuration is well adapted to GW antennas. It is classically shown that the shot-noise-limited sensitivity of a Michelson interferometer ultimately improves as the square root of the light source’s power. This is a strong reason to increase the input laser power. However, there are at least four issues to solve before such an improvement can be made. The first is that, even with high-quality materials, a fraction of the power is absorbed by the material (either in the bulk or in the coating); this gives rise to a source of heat at the surface or in the bulk, and consequently to a temperature field in the material, which results in turn in a refractive-index field and in a thermal distortion of the substrate. These defects cause a mismatching of the interferometer and therefore, already at the present power levels of LIGO and Virgo, require complex thermal compensation systems. Before the incident power is increased, some new ideas would be welcome. The second issue comes from the fact that, in the region of 100 Hz, the sensitivity is not limited by shot noise, but rather by the thermal noise of the mirrors. Mirror substrates may be viewed as elastodynamic oscillators whose modes are excited at room temperature, resulting in a fluctuating reflecting face. Increasing the laser power will be of no use in this strategic spectral region unless a means of reducing the effect of thermal noise is found. The third issue is still another source of noise, called thermoelastic, due to temperature fluctuations in the material.
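For orientation, the power scaling invoked above can be written explicitly. The following is the textbook expression for the shot-noise-limited strain spectral density of a simple Michelson interferometer; the symbols ($L$ arm length, $P$ laser power, $\lambda$ optical wavelength) are generic conventions, not definitions taken from the present text, and numerical factors depend on the detection scheme:
$$
\tilde h_{\mathrm{shot}} \;\simeq\; \frac{1}{L}\sqrt{\frac{\hbar c \lambda}{2\pi P}}\qquad \left[\mathrm{Hz}^{-1/2}\right],
$$
so that doubling the circulating power reduces the shot-noise floor by a factor of $\sqrt{2}$, which is the motivation for high-power operation (and, in real antennas, for power recycling and Fabry–Pérot arm cavities that raise the effective $P$).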
The fourth issue is the effect of radiation pressure on the suspended mirrors. Increasing the laser power will cause increasing fluctuations in the radiation pressure, so that there is an optimum in the laser power, dependent on the cavity parameters, giving the standard quantum limit. In the context of the R&D of advanced detectors, several ways of reducing the thermal noise have been proposed: using new high-Q materials, cooling to cryogenic temperatures, or active correction. Changing the geometry of the readout beam, in order to reduce the optical coupling with surface fluctuations, has also been proposed. Along this track, there was a proposal [11, 40] to go towards nonspherical mirrors generating a more-or-less homogeneous light-intensity profile. There was another proposal in the same spirit, but keeping spherical mirrors and using high-order Gaussian modes. Some other proposals are also being considered.
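The optimum mentioned above can be made quantitative. A commonly quoted form of the standard quantum limit for a detector with mirrors of mass $m$ and arm length $L$, at observation angular frequency $\Omega = 2\pi f$, is (numerical factor depending on conventions; this expression is not taken from the present text):
$$
\tilde h_{\mathrm{SQL}}(\Omega) \;=\; \frac{1}{\Omega L}\sqrt{\frac{8\hbar}{m}}\qquad \left[\mathrm{Hz}^{-1/2}\right].
$$
At the optimal power, the shot noise (which falls with $P$) and the radiation-pressure noise (which grows with $P$) contribute equally, and their quadrature sum touches this limit.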
Thermal effects and the various thermal noises (Brownian, thermoelastic, thermorefractive) have been extensively studied and reported in the literature. We focus here on their dependence on the transverse structure of the optical readout beam and try, in particular, to give general formulas for arbitrary-order Laguerre–Gauss modes.
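As a small illustration of the beam shapes in question, the sketch below evaluates the normalized transverse intensity profile of an axisymmetric Laguerre–Gauss mode $\mathrm{LG}_{p,0}$, using the standard definition $I_p(r) \propto e^{-2r^2/w^2}\,L_p(2r^2/w^2)^2$. The mode order `p` and width `w` are illustrative values, not parameters from the text:

```python
# Sketch: normalized intensity profile of an axisymmetric Laguerre-Gauss
# mode LG_{p,0} of width w. The prefactor 2/(pi w^2) normalizes the total
# power over the transverse plane to 1 (it follows from the orthonormality
# of the Laguerre polynomials under the weight e^{-u}).
import numpy as np
from scipy.special import eval_laguerre  # Laguerre polynomial L_p(x)

def lg_intensity(r, p=5, w=0.05):
    """Intensity of the LG_{p,0} mode at radius r (same units as w)."""
    u = 2.0 * r**2 / w**2
    return (2.0 / (np.pi * w**2)) * np.exp(-u) * eval_laguerre(p, u) ** 2

# Sanity check: integrate I(r) * 2*pi*r dr over the plane (trapezoid rule);
# the result should be 1 for any mode order p.
r = np.linspace(0.0, 0.5, 200_001)          # radial grid out to 10 w
f = lg_intensity(r) * 2.0 * np.pi * r       # integrand for the radial power
power = np.sum(0.5 * (f[1:] + f[:-1]) * np.diff(r))
```

Higher-order modes of this family spread the power over a wider annular region than the fundamental Gaussian, which is precisely why they couple more weakly to localized surface fluctuations of the mirror.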
Further material will be added to the present review as new developments occur, especially regarding experimental results. But we think it is useful to present the already-available results during the present R&D phase.
This work is licensed under a Creative Commons License.