Interferometry is a common technique in fiber sensing that requires a demodulation algorithm to extract the signal of interest. Of particular interest for sensor characterization is the performance of the demodulation scheme under input noise. Here we consider correlated, biased intensity noise corrupting an interferometer. We analytically derive the probability density function of the resulting output noise and use it to compute low-order statistical moments of that noise. We compare the analytical formulations with simulated data from a representative demodulation scheme used in an existing fiber Bragg grating sensor system and find excellent agreement for the example of Gaussian input noise.
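The comparison described above can be sketched in a minimal Monte Carlo experiment. This is an illustrative example only, not the paper's actual demodulation scheme: it assumes a simple arctangent (quadrature) demodulator, and all numerical values (phase, noise bias, standard deviation, correlation coefficient) are hypothetical stand-ins for correlated, biased Gaussian intensity noise.

```python
import numpy as np

rng = np.random.default_rng(0)
N = 200_000  # number of Monte Carlo samples

# True interferometric phase to be recovered (assumed constant here)
phi = 0.3

# Ideal quadrature outputs of the interferometer
I_clean = np.cos(phi)
Q_clean = np.sin(phi)

# Correlated, biased Gaussian intensity noise on the two channels.
# Bias mu, standard deviation sigma, and correlation rho are
# illustrative values, not taken from the paper.
mu, sigma, rho = 0.02, 0.05, 0.6
cov = sigma**2 * np.array([[1.0, rho],
                           [rho, 1.0]])
noise = rng.multivariate_normal([mu, mu], cov, size=N)

# Arctangent demodulation of the noisy quadrature signals
I = I_clean + noise[:, 0]
Q = Q_clean + noise[:, 1]
phi_hat = np.arctan2(Q, I)

# Output noise and its low-order statistical moments, which would be
# compared against the analytically derived values
err = phi_hat - phi
mean = err.mean()
var = err.var()
skew = ((err - mean) ** 3).mean() / var**1.5
print(f"mean={mean:.5f}  var={var:.6f}  skew={skew:.3f}")
```

In a study like the one summarized here, the empirical moments printed at the end would be checked against the closed-form moments obtained from the analytical probability density function of the output noise.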