A model suitable for analyzing the nonlinear interaction between signal and noise, mediated by the Kerr effect in optical communication systems, is presented. The model treats signal and noise separately and permits analysis of the evolution of each symbol's central time position and frequency. It is shown that this nonlinear signal-noise interaction produces random frequency shifts of the symbols, which in turn induce timing jitter in all types of systems. We also discuss the problem of estimating timing jitter for a signal embedded in noise.
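As a rough illustration of the mechanism summarized above (not the paper's model), the sketch below assumes a standard Gordon-Haus-type conversion: a random frequency shift acquired at distance z along the link is turned into a timing shift by the group-velocity dispersion of the remaining fiber, delta_t = beta2 (L - z) delta_omega, so independent shifts accumulated span by span produce timing jitter. All parameter values (beta2, span length, number of spans, shift variance) are illustrative assumptions, not values from the paper.

```python
import numpy as np

# Minimal Monte-Carlo sketch (illustrative assumptions, not the paper's model):
# a random per-span frequency shift delta_omega is converted into a timing
# shift by the dispersion accumulated over the *remaining* fiber,
#   delta_t = beta2 * (L - z) * delta_omega
rng = np.random.default_rng(0)

beta2 = -21.7e-27               # GVD parameter [s^2/m], typical SMF (assumed)
span_len = 80e3                 # span length [m] (assumed)
n_spans = 20                    # number of spans (assumed)
sigma_domega = 2 * np.pi * 1e8  # std of per-span frequency shift [rad/s] (assumed)
n_trials = 100_000              # Monte-Carlo realizations

L = n_spans * span_len
z = (np.arange(n_spans) + 1) * span_len   # positions where the shifts occur

# independent random frequency shifts at each span, for every realization
domega = rng.normal(0.0, sigma_domega, size=(n_trials, n_spans))

# each shift is weighted by the dispersion of the fiber still ahead of it
dt = (domega * beta2 * (L - z)).sum(axis=1)   # total timing shift per trial

print(f"RMS timing jitter: {dt.std() * 1e12:.3f} ps")
```

Because the variance of `dt` sums terms proportional to (L - z)^2, shifts acquired early in the link dominate, and for identically distributed shifts the RMS jitter grows roughly like L^{3/2} with link length.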