In this paper, we consider the performance of blind maximum likelihood sequence detection (MLSD) when the recursive least-squares (RLS) algorithm is used to update the channel estimates. We employ asymptotic efficiency analysis to characterize the performance of the detector as the signal-to-noise ratio (SNR) approaches infinity. This analysis allows us to quantify the loss in performance due to the presence of intersymbol interference (ISI) and the lack of channel knowledge. We show that, under certain conditions, the asymptotic efficiency of the detector depends only on a single most likely noise realization. Our results indicate that the performance of the RLS-based detector depends strongly on both the magnitude of the ISI and the number of data samples available.
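As a concrete illustration of the channel-update step referred to above, the sketch below shows one iteration of the textbook RLS recursion for a finite-impulse-response channel estimate. It is a minimal, real-valued (BPSK) sketch under standard assumptions, not the paper's implementation: the names (rls_channel_update, lam, delta, h_true) are ours, and in the blind MLSD setting the regressor would be built from a hypothesized symbol sequence (e.g., per survivor path) rather than from known symbols as in this demo.

```python
import numpy as np

def rls_channel_update(h, P, x, y, lam=0.99):
    """One RLS update of a real-valued channel estimate.

    h   : current channel estimate (length-L tap vector)
    P   : current inverse input-correlation matrix (L x L)
    x   : regressor of the L most recent symbols (hypothesized, in blind MLSD)
    y   : received sample
    lam : forgetting factor (0 < lam <= 1)
    """
    Px = P @ x
    k = Px / (lam + x @ Px)              # gain vector
    e = y - h @ x                        # a priori estimation error
    h_new = h + k * e                    # updated channel estimate
    P_new = (P - np.outer(k, Px)) / lam  # updated inverse correlation matrix
    return h_new, P_new

# Illustrative use: estimate a hypothetical 3-tap ISI channel from BPSK symbols.
rng = np.random.default_rng(0)
h_true = np.array([1.0, 0.5, 0.2])   # assumed channel taps, for the demo only
L, N, sigma = 3, 200, 0.1
delta = 100.0                        # large initial P = delta*I => weak regularization
h_est, P = np.zeros(L), delta * np.eye(L)
s = rng.choice([-1.0, 1.0], size=N + L - 1)
for n in range(N):
    x = s[n:n + L][::-1]             # most recent symbol first
    y = h_true @ x + sigma * rng.standard_normal()
    h_est, P = rls_channel_update(h_est, P, x, y)
print(np.round(h_est, 3))            # estimate should approach h_true
```

With the forgetting factor lam close to 1, the recursion weights all samples nearly equally, which is consistent with the abstract's observation that performance depends on the number of data samples available: the channel estimate, and hence the detector, improves as more samples enter the recursion.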