FAP extraction using three-dimensional motion estimation

Authors: N. Sarris (Aristotle Univ. of Thessaloniki, Greece), N. Grammalidis, M. G. Strintzis

An integral part of the MPEG-4 standard is the definition of facial animation parameters (FAPs). This paper presents a method for determining FAPs from the three-dimensional (3-D) rigid and nonrigid motion of human facial features, recovered from two-dimensional (2-D) image sequences. The proposed method assumes that a 3-D model has been fitted to the first frame of the sequence; it then tracks the motion of characteristic facial features across frames, calculates the 3-D rigid and nonrigid motion of those features, and from this estimates the FAPs as defined by the MPEG-4 coding standard. The 2-D tracking process is based on a novel enhanced version of the algorithm proposed by Kanade, Lucas, and Tomasi (1994, 1991). The nonrigid motion estimation is achieved with the same tracking mechanism, guided by the facial motion model implied by the MPEG-4 FAPs.
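As a rough illustration of the core Kanade-Lucas-Tomasi step that the paper's 2-D tracker builds on (not the authors' enhanced version), the following NumPy sketch estimates the translation of a feature window between two frames by solving the classic 2x2 Lucas-Kanade system. The synthetic blob images and window coordinates are illustrative assumptions, not data from the paper:

```python
import numpy as np

def lk_displacement(I, J, window):
    """Estimate the (dx, dy) shift of a feature window from frame I to frame J
    by solving the Lucas-Kanade normal equations G d = b."""
    Ix = np.gradient(I, axis=1)          # horizontal image gradient
    Iy = np.gradient(I, axis=0)          # vertical image gradient
    It = J - I                           # temporal difference
    r0, r1, c0, c1 = window
    ix = Ix[r0:r1, c0:c1].ravel()
    iy = Iy[r0:r1, c0:c1].ravel()
    it = It[r0:r1, c0:c1].ravel()
    # 2x2 gradient matrix and mismatch vector, summed over the window
    G = np.array([[ix @ ix, ix @ iy],
                  [ix @ iy, iy @ iy]])
    b = -np.array([ix @ it, iy @ it])
    return np.linalg.solve(G, b)         # displacement [dx, dy]

# Synthetic example: a smooth blob translated by (0.5, 0.3) pixels.
y, x = np.mgrid[0:64, 0:64].astype(float)
blob = lambda cx, cy: np.exp(-((x - cx) ** 2 + (y - cy) ** 2) / 50.0)
I = blob(32.0, 32.0)
J = blob(32.5, 32.3)                     # moved 0.5 px right, 0.3 px down
d = lk_displacement(I, J, (22, 42, 22, 42))
```

In a full tracker this single-step estimate is iterated (and applied over an image pyramid) until the window residual converges, which is the mechanism the paper reuses, with an MPEG-4 facial motion model constraining where each feature window may move.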

Published in: IEEE Transactions on Circuits and Systems for Video Technology (Volume 12, Issue 10)