Multiple-antenna concepts for wireless communication systems promise high spectral efficiency and low error rates through proper exploitation of the randomness in multipath propagation. In this paper, we investigate the impact of channel uncertainty caused by channel estimation errors on error rate performance. We consider a training-based multiple-antenna system that reserves a portion of time to sound the channel. Training symbols are used to estimate the channel by means of an arbitrary linear filter at the receiver. No channel state information (CSI) is assumed at the transmitter. We present a new framework for analyzing training-based multiple-antenna systems by introducing an equivalent system model that specifies the channel through the estimated (and hence, known) channel coefficients and an uncorrelated, data-dependent, multiplicative noise. We derive the maximum-likelihood (ML) detector, highlight its behavior in the limiting cases of perfect CSI and no CSI, and discuss its relation to several mismatched detectors. We derive new exact expressions and Chernoff bounds on the pairwise error probability (PEP), which are used to assess word-error and bit-error rate bounds for ML and mismatched detection. Finally, we review code design guidelines in light of the deleterious effect of channel uncertainty on coherent and noncoherent signaling schemes, and present numerical results.
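To make the equivalent system model concrete, the following sketch simulates a training-based MIMO link with a least-squares channel estimator (one instance of the arbitrary linear filter the abstract allows) and shows the received data symbol decomposed into a known-channel term plus a residual that combines the data-dependent estimation-error noise and the additive noise. All dimensions, the noise level, and the training length are illustrative assumptions, not values from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
nt, nr, T = 2, 2, 8  # tx antennas, rx antennas, training length (assumed values)

# True Rayleigh-fading channel, unknown to the receiver.
H = (rng.standard_normal((nr, nt)) + 1j * rng.standard_normal((nr, nt))) / np.sqrt(2)

# Known training matrix (nt x T) of unit-energy QPSK-like symbols.
S = (rng.choice([1.0, -1.0], (nt, T)) + 1j * rng.choice([1.0, -1.0], (nt, T))) / np.sqrt(2)

sigma = 0.1  # assumed noise standard deviation
N = sigma * (rng.standard_normal((nr, T)) + 1j * rng.standard_normal((nr, T))) / np.sqrt(2)
Y = H @ S + N  # received training observations

# Linear (least-squares) channel estimate: H_hat = Y S^H (S S^H)^{-1}.
H_hat = Y @ S.conj().T @ np.linalg.inv(S @ S.conj().T)

# Equivalent model for a data symbol x:
#   y = H_hat @ x + (H - H_hat) @ x + n,
# where (H - H_hat) @ x is the data-dependent noise caused by estimation error.
x = np.array([1 + 0j, -1 + 0j]) / np.sqrt(2)
n = sigma * (rng.standard_normal(nr) + 1j * rng.standard_normal(nr)) / np.sqrt(2)
y = H @ x + n

residual = y - H_hat @ x  # equals (H - H_hat) @ x + n
print("estimation error norm:", np.linalg.norm(H - H_hat))
print("decomposition gap:", np.linalg.norm(y - (H_hat @ x + residual)))
```

The decomposition is exact by construction; the point of the paper's framework is that treating `H_hat` as the known channel and `residual` as uncorrelated, data-dependent noise yields tractable ML detectors and PEP expressions.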