We consider two widely referenced trained finite-length linear equalizers, namely, the mismatched minimum mean square error (MMSE) equalizer and the least-squares (LS) equalizer. Using matrix perturbation theory, we express both of them as perturbations of the ideal MMSE equalizer and derive insightful analytical expressions for their excess mean square error. We observe that, in general, the mismatched MMSE equalizer performs (much) better than the LS equalizer. We attribute this phenomenon to the fact that the LS equalizer implicitly estimates the input second-order statistics from the finite training record, while the mismatched MMSE equalizer uses perfect knowledge of them. Thus, assuming that the input second-order statistics are known at the receiver, which is usually the case, the use of the mismatched MMSE equalizer is preferable, in general.
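The contrast above can be illustrated numerically. The following sketch (not the paper's own experiment; channel taps, equalizer length, training size, and noise level are all illustrative assumptions) builds both equalizers for a short FIR channel: the LS equalizer is fit to a finite training record, so it implicitly estimates the input second-order statistics from that record, while the mismatched MMSE equalizer uses an imperfect channel estimate but the exact statistics of the unit-power white input.

```python
import numpy as np

def run_trial(rng, h, L=8, N=50, Ntest=500, sigma2=0.01):
    """One Monte Carlo trial: build both equalizers from N training
    symbols, then measure their MSE on fresh test data."""
    M = h.size

    def channel(s):
        # True channel output plus AWGN, trimmed to the input length.
        return (np.convolve(s, h)[:s.size]
                + np.sqrt(sigma2) * rng.standard_normal(s.size))

    def data_matrix(x):
        # Row n holds the regressor [x[n], x[n-1], ..., x[n-L+1]].
        X = np.zeros((x.size, L))
        for k in range(L):
            X[k:, k] = x[:x.size - k]
        return X

    # --- training ---
    s = rng.choice([-1.0, 1.0], size=N)      # unit-power BPSK symbols
    X = data_matrix(channel(s))

    # LS equalizer: min ||X w - s||^2 over the finite training record,
    # so the input statistics are only implicitly (noisily) estimated.
    w_ls, *_ = np.linalg.lstsq(X, s, rcond=None)

    # Mismatched MMSE equalizer: imperfect channel estimate h_hat,
    # but exact second-order statistics of the unit-power white input.
    h_hat = h + 0.01 * rng.standard_normal(M)
    H = np.zeros((L, L + M - 1))             # channel convolution matrix
    for i in range(L):
        H[i, i:i + M] = h_hat
    R = H @ H.T + sigma2 * np.eye(L)         # received autocorrelation
    p = H[:, 0]                              # cross-corr. with s[n] (delay 0)
    w_mmse = np.linalg.solve(R, p)

    # --- testing on fresh data through the *true* channel ---
    st = rng.choice([-1.0, 1.0], size=Ntest)
    Xt = data_matrix(channel(st))
    mse = lambda w: np.mean((Xt @ w - st) ** 2)
    return mse(w_ls), mse(w_mmse)

rng = np.random.default_rng(0)
h = np.array([1.0, 0.5, 0.2])                # illustrative FIR channel
trials = [run_trial(rng, h) for _ in range(200)]
mse_ls = np.mean([t[0] for t in trials])
mse_mmse = np.mean([t[1] for t in trials])
print(f"avg MSE  LS: {mse_ls:.4f}   mismatched MMSE: {mse_mmse:.4f}")
```

With a short training record, the LS solution's excess MSE (driven by the sample-estimated statistics) typically dominates the mismatched MMSE equalizer's excess MSE (driven by a small channel-estimation error), consistent with the abstract's conclusion; the exact gap depends on the assumed training length and mismatch level.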