In dispersion-compensated systems, the intensity distortion induced by the interplay between cross-phase modulation and fiber chromatic dispersion can be a primary cause of transmission degradation. This interplay has mostly been studied through time-consuming computer simulations. This letter introduces a new model of the interplay in fiber transmissions with dispersion compensation, leading to a linear filter that, when applied to the input intensity of a modulated interfering channel, gives the intensity distortion of a continuous-wave probe signal at the receiver. The model can be of significant value in the search for optimized dispersion maps.
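The filtering step described above can be sketched numerically: the interfering channel's input intensity waveform is transformed to the frequency domain, multiplied by the model's transfer function, and transformed back to yield the probe's intensity distortion. The letter's actual transfer function (which would encode the dispersion map, walk-off, and fiber nonlinearity) is not reproduced here; `H` below is a hypothetical first-order placeholder standing in for it, and all names and parameter values are illustrative assumptions.

```python
import numpy as np

def apply_xpm_filter(pump_intensity, transfer_fn, dt):
    """Apply a linear XPM intensity-distortion filter in the frequency domain.

    pump_intensity : samples of the modulated interfering channel's input intensity.
    transfer_fn    : callable H(omega) giving the complex filter response; in the
                     letter's model this would be derived from the dispersion map.
                     Here it is supplied by the user (an assumption of this sketch).
    dt             : sample spacing in seconds.
    Returns the real-valued intensity distortion of the CW probe at the receiver.
    """
    n = len(pump_intensity)
    omega = 2.0 * np.pi * np.fft.fftfreq(n, d=dt)   # angular frequency grid
    spectrum = np.fft.fft(pump_intensity)
    distorted = np.fft.ifft(spectrum * transfer_fn(omega))
    return distorted.real

# Illustrative input: an OOK-like intensity pattern for the interfering channel
dt = 1e-12                                   # 1 ps sampling
t = np.arange(1024) * dt
pump = 1e-3 * (1.0 + np.sign(np.sin(2.0 * np.pi * 10e9 * t)))

# Hypothetical placeholder response (NOT the letter's filter): first-order low-pass
H = lambda w: 0.05 / (1.0 + 1j * w * 5e-12)

distortion = apply_xpm_filter(pump, H, dt)
```

Because the model is a linear filter, sweeping candidate dispersion maps reduces to recomputing `H` and repeating the multiplication, which is far cheaper than a full split-step simulation per map.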