Interferometer phase noise due to beam misalignment on diffraction gratings

Research output: Contribution to journal › Article

Abstract

All-reflective interferometer configurations have been proposed for the next generation of gravitational wave detectors, with diffractive elements replacing transmissive optics. However, an additional source of phase noise imposes more stringent requirements on alignment stability. A framework describing alignment stability for diffractive elements, based on a Gaussian beam model, was therefore required. We develop such a framework using modal decomposition to model small displacements of the beam (or grating), and show that the modal model does not contain the phase changes seen in an otherwise equivalent geometric planewave approach. The modal decomposition description is justified by verifying experimentally that the phase of a diffracted Gaussian beam is independent of the beam shape, achieved by comparing the phase change between a zero-order and a first-order mode beam. To interpret our findings we employ a rigorous time-domain simulation and demonstrate that the phase changes resulting from a modal decomposition are correct, provided that the coordinate system in which the phase is measured moves together with the effective beam displacement. This indeed corresponds to the phase change observed in the geometric planewave model. The change of coordinate system does not arise naturally within the analytical framework, and therefore requires either a manual change of coordinates or the addition of the geometric planewave phase factor.
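The geometric planewave phase factor referred to above is the standard grating result: a lateral displacement Δx of the grating (or beam) shifts the phase of the m-th diffraction order by Δφ = 2πmΔx/d, where d is the grating period. A minimal sketch of this relation (the grating period and displacement below are illustrative values, not parameters from the paper):

```python
import math

def grating_phase_shift(delta_x: float, order: int, period: float) -> float:
    """Geometric planewave phase shift (radians) acquired by the m-th
    diffraction order when beam and grating are displaced laterally
    relative to each other by delta_x: dphi = 2*pi*m*delta_x / d."""
    return 2.0 * math.pi * order * delta_x / period

# Illustrative numbers: 1 nm relative displacement, 1480 nm grating period.
dphi_first = grating_phase_shift(1e-9, order=1, period=1480e-9)
dphi_zero = grating_phase_shift(1e-9, order=0, period=1480e-9)
```

Note that the zeroth order (specular reflection) picks up no phase, consistent with the displacement sensitivity being specific to the diffracted orders.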

Details

Original language: English
Pages (from-to): 29578-29591
Journal: Optics Express
Volume: 21
Issue number: 24
Early online date: 22 Nov 2013
Publication status: Published - 2 Dec 2013

Keywords

  • Diffraction gratings, Diffraction theory, Phase shift