We study the performance of the steepest descent (SD) and least mean square (LMS) algorithms applied to linear detection for multiple-input multiple-output (MIMO) systems in a correlated Rayleigh fading environment. Using random matrix theory, we first analyze stability for a fixed step-size parameter. We then consider two always-stable channel-adaptive strategies for choosing the step size and analytically evaluate their performance. Finally, we derive bounds on the mean misadjustment of the LMS algorithm.
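To make the setting concrete, the following is a minimal sketch of LMS-based linear MIMO detection with a fixed step size, the regime whose stability the abstract refers to. All specifics here are illustrative assumptions not taken from the paper: a 4x4 system, a Kronecker-type exponential receive-correlation model for the correlated Rayleigh channel, QPSK training symbols, and the parameter values. The well-known sufficient condition for mean stability of LMS, 0 < mu < 2/lambda_max(R_yy) with R_yy the received-signal covariance, motivates the fixed-mu choice below; the paper's two channel-adaptive step-size rules are not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical illustration: 4x4 MIMO over a correlated Rayleigh channel
# (Kronecker model with exponential receive correlation, rho assumed).
n_t, n_r = 4, 4
rho = 0.5
R_rx = rho ** np.abs(np.subtract.outer(np.arange(n_r), np.arange(n_r)))
H_iid = (rng.standard_normal((n_r, n_t))
         + 1j * rng.standard_normal((n_r, n_t))) / np.sqrt(2)
H = np.linalg.cholesky(R_rx) @ H_iid        # correlated channel realization

snr_db = 20.0
noise_var = 10 ** (-snr_db / 10)

# Fixed step size; stability requires 0 < mu < 2 / lambda_max(R_yy),
# where R_yy = E[y y^H] is the covariance of the received vector.
mu = 0.01
W = np.zeros((n_t, n_r), dtype=complex)     # linear detector: x_hat = W @ y

for _ in range(5000):
    # Known QPSK training symbols drive the adaptation.
    x = (rng.choice([-1, 1], n_t) + 1j * rng.choice([-1, 1], n_t)) / np.sqrt(2)
    n = np.sqrt(noise_var / 2) * (rng.standard_normal(n_r)
                                  + 1j * rng.standard_normal(n_r))
    y = H @ x + n
    e = x - W @ y                           # a-priori estimation error
    W += mu * np.outer(e, y.conj())         # complex LMS update

print("residual MSE on last training vector:", np.mean(np.abs(x - W @ y) ** 2))
```

A channel-adaptive rule (for instance, normalizing mu by the instantaneous input energy, as in normalized LMS) removes the dependence of stability on the unknown eigenvalue spread, which is the motivation behind the always-stable strategies the abstract analyzes.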