We use computer simulation techniques to evaluate the limitations on laser misalignment tolerances for multiwavelength optical networks with a cascade of 2 to 100 (de)multiplexers modelled as either first-order filters or third-order Butterworth filters. The results depend on the number of (de)multiplexers used, on the bit rate, and on the filter characteristics. We find that both the magnitude and the phase characteristics of the (de)multiplexer transfer function are important in determining the distortion-induced penalties. The allowable laser misalignment tolerances, at 10 Gb/s and for systems using (de)multiplexers modelled as third-order Butterworth filters, vary from ±78 GHz (±0.63 nm) for systems with a cascade of 2 filters to ±18 GHz (±0.15 nm) for systems with a cascade of 100 filters.
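The shrinking tolerance with cascade depth can be illustrated with a magnitude-only sketch: cascading N identical filters multiplies their power transmissions, so the detuning at which the accumulated loss crosses a threshold narrows as N grows. The sketch below is not the paper's simulation — it ignores the phase-induced distortion the abstract highlights — and the 3 dB bandwidth (`bw_3db_ghz`) and the 1 dB loss budget (`max_loss_db`) are illustrative assumptions, not values from the paper.

```python
import numpy as np

def butterworth_power(detuning_ghz, bw_3db_ghz, order=3):
    # Power transmission of one Butterworth filter vs. laser detuning
    # from the filter centre; bandwidth and order are assumptions.
    return 1.0 / (1.0 + (2.0 * detuning_ghz / bw_3db_ghz) ** (2 * order))

def cascade_loss_db(detuning_ghz, n_filters, bw_3db_ghz, order=3):
    # A cascade of N identical filters multiplies power transmissions,
    # i.e. the dB losses add.
    return -10.0 * n_filters * np.log10(
        butterworth_power(detuning_ghz, bw_3db_ghz, order)
    )

def tolerance_ghz(n_filters, bw_3db_ghz, max_loss_db=1.0, order=3):
    # Largest detuning keeping the cascade loss under max_loss_db,
    # found by bisection (loss grows monotonically with detuning).
    lo, hi = 0.0, bw_3db_ghz
    for _ in range(60):
        mid = 0.5 * (lo + hi)
        if cascade_loss_db(mid, n_filters, bw_3db_ghz, order) < max_loss_db:
            lo = mid
        else:
            hi = mid
    return lo
```

For any assumed bandwidth, `tolerance_ghz(100, bw)` comes out well below `tolerance_ghz(2, bw)`, mirroring the qualitative trend reported in the abstract (±78 GHz down to ±18 GHz), though the actual numbers there also reflect phase distortion and bit-rate effects that this loss-only model does not capture.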