A K-user direct-sequence spread-spectrum code-division multiple-access (CDMA) system with q-bit (q ≪ log₂K) baseband signal quantization at the demodulator is considered. It is shown that additionally quantizing the (K+1)-level output signal of the CDMA modulator into q bits significantly improves the average bit-error-rate (BER) performance in a non-negligible regime of noise variance, σ², and user load, β, under various system settings, including additive white Gaussian noise (AWGN), Rayleigh fading, single-user detection, multi-user detection, and random and orthogonal spreading codes. For the case of single-user detection in random-spreading AWGN-CDMA, this regime is identified explicitly as σ < γ(q)√β, where γ(q) is a certain pre-factor depending on q, and the associated BER improvement is derived analytically for q = 1, 2. For the other examined system settings, computer simulations are provided, corroborating this interesting behavior.
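The setup described above can be illustrated with a minimal Monte Carlo sketch. This is a hypothetical toy model, not the paper's exact formulation: it assumes synchronous BPSK users with random ±1 spreading of length N (so β = K/N), an AWGN channel, a 1-bit (q = 1) quantizer at the demodulator, an optional 1-bit quantizer applied to the (K+1)-level modulator output, and a per-symbol transmit-power normalization so the two cases are compared at equal power. The function name `ber_sketch` and all parameter defaults are illustrative choices, not from the source.

```python
import numpy as np

def ber_sketch(K=16, N=32, sigma=0.3, trials=2000, quantize_tx=False, seed=0):
    """Monte Carlo BER of user 0 in a toy synchronous DS-CDMA link.

    Assumed model (not the paper's exact one): random +/-1 spreading,
    BPSK data, AWGN, 1-bit (q = 1) quantization at the demodulator,
    and optional 1-bit quantization of the (K+1)-level modulator output.
    """
    rng = np.random.default_rng(seed)
    errors = 0
    for _ in range(trials):
        S = rng.choice([-1.0, 1.0], size=(N, K))   # random spreading codes
        b = rng.choice([-1.0, 1.0], size=K)        # one BPSK bit per user
        x = S @ b                                  # (K+1)-level chip-rate sum
        if quantize_tx:
            x = np.where(x >= 0, 1.0, -1.0)        # q = 1 modulator quantization
        x = x / np.sqrt(np.mean(x**2))             # equal transmit power (assumed)
        y = x + sigma * rng.standard_normal(N)     # AWGN channel
        r = np.where(y >= 0, 1.0, -1.0)            # q = 1 demodulator quantization
        b0_hat = np.sign(S[:, 0] @ r)              # single-user matched filter
        errors += b0_hat != b[0]
    return errors / trials

# Compare the two modulator settings at a fixed noise level:
ber_unquantized = ber_sketch(quantize_tx=False)
ber_quantized = ber_sketch(quantize_tx=True)
```

Sweeping `sigma` for a fixed load β = K/N in such a sketch is one way to probe the regime the abstract identifies, where quantizing the modulator output helps rather than hurts.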