The analysis of linear minimum mean-square error (MMSE) detection in a band-limited code-division multiple-access (CDMA) system that employs random spreading sequences is considered. The key features of the analysis are that the users are allowed to be completely asynchronous and that the chip waveform is assumed to be the ideal Nyquist sinc function. It is shown that the asymptotic signal-to-interference ratio (SIR) at the detector output is the same as that in an equivalent chip-synchronous system. It is hence established that synchronous analyses of linear MMSE detection can provide useful guidelines for the performance of asynchronous band-limited systems.
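As a concrete illustration of the quantity the abstract refers to: for a synchronous system with random spreading, equal received powers, and large system size, the asymptotic output SIR of the linear MMSE detector satisfies a well-known fixed-point equation (the Tse–Hanly large-system result), SIR = P / (σ² + β·P/(1 + SIR)), where β = K/N is the load. The sketch below solves that fixed point numerically; the function name and all parameter values are illustrative assumptions, not taken from the paper.

```python
# Large-system SIR of the linear MMSE detector for synchronous CDMA with
# random spreading and equal received powers (Tse-Hanly fixed point).
# Per the abstract's result, the asymptotic SIR of the asynchronous
# band-limited (sinc chip waveform) system coincides with this value.

def mmse_sir(P, sigma2, beta, iters=200):
    """Solve SIR = P / (sigma2 + beta * P / (1 + SIR)) by fixed-point iteration.

    P      -- common received power per user
    sigma2 -- background noise variance
    beta   -- system load K/N (users per chip)
    """
    sir = P / sigma2  # start from the interference-free SNR
    for _ in range(iters):
        sir = P / (sigma2 + beta * P / (1.0 + sir))
    return sir

if __name__ == "__main__":
    # Illustrative operating point: unit power, 10 dB SNR, half-loaded system.
    sir = mmse_sir(P=1.0, sigma2=0.1, beta=0.5)
    print(f"asymptotic MMSE SIR: {sir:.4f}")
```

The iteration is a contraction on SIR > 0, so it converges quickly from any positive starting point; for the parameters above the fixed point can also be checked against the closed-form root of the equivalent quadratic 0.1·SIR² − 0.4·SIR − 1 = 0.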