A robust spatial-temporal line-warping based deinterlacing method

5 Author(s)
Shing-Fat Tu ; Dept. of Electron. & Comput. Eng., Hong Kong Univ. of Sci. & Technol., Kowloon, China ; Au, O.C. ; Yannan Wu ; Enming Luo
In this paper, a line-warping based deinterlacing method is introduced. The missing pixels in interlaced video are derived by warping pixels in horizontal line pairs. To increase the accuracy of temporal prediction, multiple temporal line pairs, selected according to a constant-velocity model, are used for warping. Stationary pixels are well preserved by accurate stationary detection, and a soft switch between the spatial-temporal interpolated value and the temporal average prevents unstable switching. Owing to these novelties, the proposed method yields deinterlaced video of higher visual quality than conventional methods and suppresses most deinterlacing artifacts, such as line crawling, flickering, and ghost shadows.
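The soft-switching idea in the abstract can be illustrated with a minimal sketch: blend a spatial candidate (average of the lines above and below in the current field) with a temporal candidate (average of the co-located lines in the neighboring fields), weighted smoothly by a motion measure. This is an assumption-laden toy, not the authors' method — the paper warps multiple temporal line pairs under a constant-velocity model, and the sigmoid weight, parameter names, and thresholds here are invented for illustration.

```python
import numpy as np

def soft_switch_deinterlace(line_above, line_below, prev_line, next_line,
                            threshold=10.0, softness=5.0):
    """Reconstruct one missing line of an interlaced frame (illustrative only).

    line_above / line_below: adjacent lines in the current field.
    prev_line / next_line:   co-located lines in the previous/next fields.
    threshold, softness:     hypothetical tuning parameters for the soft switch.
    """
    spatial = 0.5 * (line_above + line_below)   # intra-field candidate
    temporal = 0.5 * (prev_line + next_line)    # inter-field candidate
    motion = np.abs(prev_line - next_line)      # crude per-pixel stationarity measure
    # Soft switch: weight tends to 1 (trust temporal) where motion is low,
    # and to 0 (fall back to spatial) where motion is high, avoiding the
    # abrupt flips a hard threshold would cause.
    w = 1.0 / (1.0 + np.exp((motion - threshold) / softness))
    return w * temporal + (1.0 - w) * spatial
```

For a stationary pixel (previous and next fields agree), the output follows the temporal candidate, preserving detail; where the fields disagree strongly, it falls back to spatial interpolation, which is the behavior that suppresses ghost shadows.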

Published in:

2009 IEEE International Conference on Multimedia and Expo (ICME 2009)

Date of Conference:

28 June - 3 July 2009