This paper presents a performance analysis of the least squares (LS) based estimation of a linear time-invariant (LTI) channel. Given the inputs to a finite impulse response (FIR) channel and the channel outputs corrupted by noise, the channel impulse response is estimated using the recursive LS (RLS) algorithm. The analysis found in the literature relies on the assumption that the expectation of the inverse of the sample covariance matrix is approximately equal to the scaled inverse of the true covariance matrix, which holds only when the number of observations is very large. To characterize the performance of the algorithm when the number of observations is small to moderate, some results from the theory of large dimensional random matrices are exploited. Expressions for the mean square value of the channel estimation and signal prediction errors are derived, and they closely match the results obtained from simulations. It is also shown that at lower signal-to-noise ratios (SNR), a deterioration in performance appears when the number of observations is around the channel length. This effect, which owes its manifestation to the very nature of the RLS algorithm, is explained and theoretically characterized.
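The setup the abstract describes can be sketched as follows: a known input is passed through an FIR channel, noise is added at a given SNR, and the standard RLS recursion estimates the channel taps. This is a minimal illustrative sketch, not the paper's method; the channel taps `h_true`, filter length `L`, forgetting factor `lam`, initialization constant `delta`, and SNR are all assumed values chosen for demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)
L = 4                                   # channel length (assumed for illustration)
h_true = np.array([0.8, -0.4, 0.2, 0.1])  # hypothetical FIR channel taps
N = 500                                 # number of observations
snr_db = 20.0                           # assumed SNR

x = rng.standard_normal(N)              # channel input
y_clean = np.convolve(x, h_true)[:N]    # noiseless FIR channel output
noise_var = np.var(y_clean) / 10 ** (snr_db / 10)
y = y_clean + np.sqrt(noise_var) * rng.standard_normal(N)

# Standard RLS recursion: P tracks the inverse of the (exponentially
# weighted) sample covariance matrix; w is the channel estimate.
lam = 1.0                               # forgetting factor (1 = growing window)
delta = 1e2                             # regularization for P's initialization
w = np.zeros(L)
P = delta * np.eye(L)
for n in range(N):
    # Regressor: the L most recent input samples (zero-padded at the start).
    u = np.array([x[n - i] if n - i >= 0 else 0.0 for i in range(L)])
    k = P @ u / (lam + u @ P @ u)       # gain vector
    e = y[n] - w @ u                    # a priori prediction error
    w = w + k * e                       # update channel estimate
    P = (P - np.outer(k, u @ P)) / lam  # update inverse covariance estimate

mse = np.mean((w - h_true) ** 2)        # channel estimation MSE
print(mse)
```

With many observations (here N = 500, well beyond the channel length), the estimate converges close to the true taps; the regime the paper analyzes is precisely the opposite one, where N is comparable to L and the sample covariance matrix is poorly conditioned.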
Date of Conference: 28-30 Sept. 2011