In this paper, the impact of clipping noise on optical wireless communication (OWC) systems employing orthogonal frequency division multiplexing (OFDM) is investigated. The two existing optical OFDM (O-OFDM) transmission schemes, asymmetrically clipped optical OFDM (ACO-OFDM) and direct-current-biased optical OFDM (DCO-OFDM), are studied. Time-domain signal clipping generally results from direct current (DC) biasing and/or from physical limitations of the transmitter front-end, such as insufficient forward biasing and the maximum power driving limit of the emitter. According to the Bussgang theorem and the central limit theorem (CLT), the clipping noise can be modeled at the receiver as an attenuation of the data-carrying subcarriers plus additive zero-mean complex-valued Gaussian noise. Analytical expressions for the attenuation factor and the clipping noise variance are determined in closed form and employed in the derivation of the electrical signal-to-noise ratio (SNR). The validity of the model is verified through a Monte Carlo bit-error ratio (BER) simulation. Finally, the BER performance of ACO-OFDM and DCO-OFDM is compared for different clipping levels and multi-level quadrature amplitude modulation (M-QAM) schemes.
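The Bussgang decomposition described above can be illustrated numerically. The sketch below (a minimal illustration, not the paper's simulation setup; the clipping levels are assumed example values) clips a unit-variance Gaussian signal and estimates the attenuation factor K = E[x_clip · x] / E[x²], comparing it against the closed-form value for Gaussian input, K = Φ(c_hi) − Φ(c_lo), where Φ is the standard normal CDF.

```python
import numpy as np
from math import erf, sqrt

# Bussgang decomposition of a clipped Gaussian signal (illustrative sketch).
# For a zero-mean Gaussian input x clipped to [c_lo, c_hi], the output can be
# written as  x_clip = K*x + n_clip,  where K = E[x_clip*x]/E[x^2] and the
# clipping noise n_clip is uncorrelated with x.

rng = np.random.default_rng(0)
x = rng.standard_normal(1_000_000)   # unit-variance bipolar OFDM-like signal
c_lo, c_hi = -1.0, 2.0               # example clipping levels (assumed values)
x_clip = np.clip(x, c_lo, c_hi)

# Empirical attenuation factor via the Bussgang theorem
K = np.mean(x_clip * x) / np.mean(x ** 2)

# Closed form for unit-variance Gaussian input: K = Phi(c_hi) - Phi(c_lo)
def Phi(t):
    return 0.5 * (1.0 + erf(t / sqrt(2.0)))

K_theory = Phi(c_hi) - Phi(c_lo)

# Residual clipping noise, uncorrelated with the input by construction
n_clip = x_clip - K * x
print(K, K_theory, np.mean(n_clip * x))
```

The decorrelation of `n_clip` from `x` is what lets the clipping noise be lumped into the Gaussian noise term of the SNR expression, with the useful signal scaled by K.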