This paper is devoted to the study of the performance of the linear minimum mean-square error (LMMSE) receiver for (receive-)correlated multiple-input multiple-output (MIMO) systems. From random matrix theory, it is well known that the signal-to-noise ratio (SNR) at the output of this receiver behaves asymptotically like a Gaussian random variable as the numbers of receive and transmit antennas converge to infinity at the same rate. However, this Gaussian approximation is inaccurate for estimating certain performance metrics, such as the bit error rate (BER) and the outage probability, especially for small system dimensions. To address this, Li proposed convincingly to assume that the SNR follows a generalized gamma distribution whose parameters are tuned by computing the first three asymptotic moments of the SNR. In this paper, this technique is generalized to (receive-)correlated channels, and closed-form expressions for the first three asymptotic moments of the SNR are provided. To obtain these results, a random matrix theory technique adapted to matrices with Gaussian entries is used. This technique is believed to be simple, efficient, and of broad interest in wireless communications. Simulations show that the proposed technique generally yields good accuracy, even for small system dimensions.
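The moment-matching step mentioned above can be sketched as follows. This is a minimal illustration, not the paper's implementation: it assumes the generalized gamma density f(x) = p/(a^d Γ(d/p)) x^{d-1} exp(-(x/a)^p), whose k-th moment is E[X^k] = a^k Γ((d+k)/p)/Γ(d/p), and solves numerically for (a, d, p) given the first three (asymptotic) moments of the SNR. The function names and the solver choice are illustrative.

```python
import numpy as np
from scipy.optimize import fsolve
from scipy.special import gammaln

def gg_moment(k, a, d, p):
    """k-th moment of the generalized gamma distribution with
    density f(x) = p / (a^d Gamma(d/p)) * x^(d-1) * exp(-(x/a)^p):
    E[X^k] = a^k * Gamma((d+k)/p) / Gamma(d/p)."""
    return a**k * np.exp(gammaln((d + k) / p) - gammaln(d / p))

def fit_generalized_gamma(m1, m2, m3):
    """Tune (a, d, p) so the first three moments of the generalized
    gamma law match the given moments m1, m2, m3 (hypothetical helper;
    in the paper these moments come from closed-form asymptotic
    expressions for the LMMSE output SNR)."""
    def equations(theta):
        a, d, p = np.exp(theta)  # parametrize in log-space to enforce positivity
        return [gg_moment(k, a, d, p) - m
                for k, m in ((1, m1), (2, m2), (3, m3))]
    theta0 = np.log([m1, 1.0, 1.0])  # crude initial guess
    a, d, p = np.exp(fsolve(equations, theta0))
    return a, d, p
```

Once (a, d, p) are fitted, BER and outage-probability approximations follow by integrating the fitted density instead of the Gaussian one.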