In this paper, the authors propose selective mapping (SLM) and partial transmit sequences (PTS) employing an intermodulation distortion (IMD)-reduction strategy to improve the error-probability performance of nonlinearly distorted orthogonal frequency division multiplexing (OFDM). In particular, the authors consider two IMD-reduction criteria: one requires knowledge of the nonlinearity parameters, whereas the other does not. Simulation results demonstrate that, in the presence of nonlinearities, OFDM systems using SLM or PTS with either IMD-reduction strategy outperform those using the peak-to-average power ratio (PAPR)- or the recently proposed excess power (EP)-reduction strategies. Furthermore, the error-probability performance of the IMD-reduction technique that exploits knowledge of the nonlinearity parameters is only slightly better than that of the technique that does not. Simulation results also show that the average out-of-band power generated by the IMD-reduction strategies is similar to that generated by PAPR or EP reduction.
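To make the candidate-selection idea concrete, the following is a minimal sketch of conventional SLM with the standard minimum-PAPR selection criterion, not the paper's IMD-reduction criterion (which would replace the `papr_db` metric with an IMD estimate and, for one of the two proposed criteria, a model of the nonlinearity). All parameter values (`N`, `U`, QPSK mapping, ±1 phase sequences) are illustrative assumptions, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

N = 64   # number of OFDM subcarriers (illustrative)
U = 8    # number of candidate phase sequences, including the identity

def papr_db(x):
    """Peak-to-average power ratio of a time-domain block, in dB."""
    p = np.abs(x) ** 2
    return 10 * np.log10(p.max() / p.mean())

# Random QPSK symbols on N subcarriers
bits = rng.integers(0, 4, N)
X = np.exp(1j * (np.pi / 4 + np.pi / 2 * bits))

# Candidate phase sequences: identity first, then random +/-1 rotations.
# In SLM each sequence multiplies the whole frequency-domain block;
# the chosen sequence index is sent as side information.
phases = np.vstack([np.ones(N),
                    rng.choice([1.0, -1.0], size=(U - 1, N))])

# All candidate time-domain signals and their selection metrics
candidates = np.fft.ifft(X * phases, axis=1)
paprs = np.array([papr_db(c) for c in candidates])

best = int(np.argmin(paprs))   # transmit the candidate with the best metric
```

Swapping the selection metric is the only structural change the paper's strategies require: an IMD-reduction criterion would rank the same `U` candidates by (estimated) in-band distortion after the nonlinearity rather than by peak power.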