This paper introduces an efficient intra-field deinterlacing algorithm based on Taylor series expansion and polynomial regression. To estimate the value of a missing pixel from the given data, we rely on a generic local approximation function around that point. The well-known N-term Taylor series expansion serves as a local representation of this approximation function, and polynomial regression is used to find its optimal local fit. Unlike previous intra-field deinterlacing methods, such as edge-based line averaging, which estimate edge orientations, the proposed method requires no directional difference measurements over a limited set of candidate directions. Compared with existing deinterlacing algorithms, the proposed algorithm improves the peak signal-to-noise ratio (PSNR) while maintaining high efficiency.
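The core idea above — modeling the local image intensity as a truncated Taylor series and fitting it by least-squares polynomial regression — can be sketched as follows. This is an illustrative stand-in under simplifying assumptions (a purely vertical 1-D fit per column, with hypothetical `degree` and `window` parameters), not the authors' exact formulation:

```python
import numpy as np

def fill_missing_rows(field_col, degree=2, window=4):
    """Reconstruct one full-height image column from a single field.

    `field_col` holds the pixel values on the even rows 0, 2, 4, ... of
    one column.  Each missing (odd) row is estimated by fitting a local
    polynomial -- a truncated Taylor expansion of the intensity along the
    column -- to the `window` nearest known samples via least-squares
    regression, then evaluating it at the missing row position.
    """
    known_y = np.arange(0, 2 * len(field_col), 2)   # rows present in the field
    height = 2 * len(field_col) - 1
    out = np.empty(height)
    out[known_y] = field_col                        # copy the known lines
    for y in range(1, height, 2):                   # interpolate missing rows
        # pick the `window` known rows closest to the missing row y
        idx = np.argsort(np.abs(known_y - y))[:window]
        ys, vals = known_y[idx], field_col[idx]
        # least-squares polynomial regression = optimal local N-term fit
        coeffs = np.polyfit(ys, vals, deg=degree)
        out[y] = np.polyval(coeffs, y)
    return out
```

On smooth image regions a low-order fit reproduces the intensity profile well; for example, a linear vertical ramp is recovered exactly, since the regression polynomial subsumes the linear term of the Taylor expansion.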