The number of coefficients required to estimate a time-varying channel is generally much larger than the maximum number of observable data. Linearly time-varying (LTV) channel models have been widely used to reduce the number of channel parameters: the channel is estimated from its slope and its frequency response between symbols. A critical problem with this approach, however, is that before estimation even begins, the estimated channel in the frequency domain has already been degraded by inter-carrier interference (ICI). We demonstrate this by analyzing the channel estimation mean square error (MSE) and the signal-to-interference-plus-noise ratio (SINR) after ICI cancellation under the LTV model. We then propose a new channel estimation technique using dual ICI cancellation, which resolves this problem by pre-canceling ICI for channel estimation using the channel of the previous symbol and post-canceling ICI for symbol detection using the more accurate channel estimate. The process is initialized by using the first symbol as a preamble to estimate the channel in the time domain. Performance comparisons of MSE and SINR show that the proposed method is better suited than the conventional method to fast-fading channels, where ICI dominates the interference and noise power.
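To make the parameter-reduction idea concrete, the following is a minimal sketch (not the paper's method) of the LTV approximation: within one symbol of N samples, each channel tap is modeled as a straight line, so N unknowns per tap collapse to two (a midpoint value and a slope). All names, the symbol length, and the synthetic fading profile here are illustrative assumptions.

```python
import numpy as np

# Assumed symbol length; a real OFDM system would set this from the FFT size.
N = 64
n = np.arange(N)

# Synthetic slowly fading channel tap (illustrative, not from the paper):
# slight amplitude drift plus a slowly rotating phase.
true_tap = 0.8 * np.exp(1j * (0.01 * n + 0.0001 * n**2))

# LTV model: h[n] ~ h_mid + slope * (n - N/2), fit by least squares.
A = np.stack([np.ones(N), n - N / 2], axis=1)
coef, *_ = np.linalg.lstsq(A, true_tap, rcond=None)
ltv_tap = A @ coef

# Compare against a time-invariant (constant-per-symbol) model.
const_tap = np.full(N, true_tap.mean())

mse_ltv = np.mean(np.abs(true_tap - ltv_tap) ** 2)
mse_const = np.mean(np.abs(true_tap - const_tap) ** 2)
print(f"params per tap: {A.shape[1]} (LTV) vs {N} raw samples")
print(f"MSE: LTV {mse_ltv:.2e}, constant {mse_const:.2e}")
```

The point of the sketch is only the trade-off the abstract describes: two parameters per tap track most of the intra-symbol variation that a constant-channel model misses, which is why LTV models are attractive until ICI corrupts the frequency-domain estimates they are fitted from.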