We address the problem of reconstructing a random signal from samples of its filtered version using a given interpolation kernel. To reduce the mean squared error (MSE) incurred by a nonoptimal kernel, we propose a high-rate interpolation scheme in which the interpolation grid is finer than the sampling grid. We develop a digital correction system that processes the samples prior to their multiplication with the shifts of the interpolation kernel, constructed so that the reconstructed signal is the linear minimum MSE (LMMSE) estimate of the original signal given its samples. An analytic expression for the MSE as a function of the interpolation rate is provided, which leads to an explicit condition under which the optimal MSE is achieved with the given nonoptimal kernel. Simulations confirm the reduction in MSE relative to a system with equal sampling and reconstruction rates.
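The idea can be illustrated with a small finite-dimensional sketch (all sizes, the signal covariance, the pointwise sampling operator, and the box interpolation kernel below are illustrative assumptions, not the paper's actual model). A random signal x is observed through samples y = A x and reconstructed as x_hat = B H y, where the columns of B are shifts of a fixed, deliberately nonoptimal kernel on a grid that may be finer than the sampling grid. The digital correction matrix H is chosen to minimize the MSE, so x_hat is the LMMSE estimate of x restricted to range(B):

```python
import numpy as np

n = 64   # signal length (assumed)
m = 8    # number of samples, i.e. the sampling grid size (assumed)

# Signal covariance with exponentially decaying correlations (assumption).
idx = np.arange(n)
R = 0.9 ** np.abs(idx[:, None] - idx[None, :])

# Sampling: pointwise samples on a uniform coarse grid (assumption).
A = np.zeros((m, n))
A[np.arange(m), idx[:: n // m]] = 1.0

def interp_matrix(num_shifts):
    """Columns are shifts of a box (zero-order-hold) kernel -- a deliberately
    nonoptimal kernel -- on a reconstruction grid of num_shifts points."""
    step = n // num_shifts
    B = np.zeros((n, num_shifts))
    for k in range(num_shifts):
        B[k * step:(k + 1) * step, k] = 1.0
    return B

def lmmse_mse(B):
    """Per-sample MSE of the reconstruction x_hat = B H y with optimal H."""
    Ryy = A @ R @ A.T          # covariance of the samples
    Rxy = R @ A.T              # cross-covariance of signal and samples
    # Digital correction system: minimizing E||x - B H y||^2 over H gives
    # H = (B^T B)^+ B^T Rxy Ryy^+ (projection of the unconstrained LMMSE
    # estimator onto range(B)).
    H = np.linalg.pinv(B.T @ B) @ B.T @ Rxy @ np.linalg.pinv(Ryy)
    C = B @ H
    err_cov = R - C @ Rxy.T - Rxy @ C.T + C @ Ryy @ C.T
    return np.trace(err_cov) / n

mse_equal = lmmse_mse(interp_matrix(m))      # interpolation rate = sampling rate
mse_finer = lmmse_mse(interp_matrix(4 * m))  # interpolation grid 4x finer

print(f"MSE, equal rates:        {mse_equal:.4f}")
print(f"MSE, 4x finer interp:    {mse_finer:.4f}")
```

Because each coarse box kernel is an exact sum of fine box kernels here, the finer reconstruction space contains the coarser one, so the MSE with the finer interpolation grid can only decrease, mirroring the abstract's claim.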