Towards automatic performance-driven animation between multiple types of facial model

Authors (4): D. Cosker (School of Computer Science, University of Bath, Bath), R. Borkett, D. Marshall, P. L. Rosin

Abstract: The authors describe a method for re-mapping animation parameters between multiple types of facial model for performance-driven animation. A facial performance is analysed as a set of facial action parameter trajectories using a modified appearance model whose modes of variation encode specific, pre-definable facial actions. These parameters can then be used to animate other modified appearance models or 3D morph-target-based facial models, so the animation parameters extracted from a video performance can be re-used across multiple types of facial model. The authors demonstrate the effectiveness of the approach by measuring how reliably it extracts action parameters from performances and by showing frames from example animations, and they also demonstrate its potential use in fully automatic performance-driven animation applications.
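As a rough illustration of the re-targeting step described in the abstract, the sketch below shows how per-frame action parameter trajectories could drive a simple 3D morph-target (blendshape) face model. The function name, array shapes, and linear-blend formulation are assumptions made for illustration only; they are not the authors' implementation or the appearance-model analysis stage.

import numpy as np

def animate_morph_targets(base_mesh, morph_targets, action_params):
    """
    base_mesh:     (V, 3) array of neutral-face vertex positions.
    morph_targets: (K, V, 3) array, one target mesh per facial action
                   (e.g. brow raise, jaw open), in the same vertex order.
    action_params: (T, K) array of per-frame action weights in [0, 1],
                   i.e. re-mapped trajectories from an analysed performance.
    Returns a (T, V, 3) array of animated meshes, one per frame.
    """
    offsets = morph_targets - base_mesh  # (K, V, 3) deltas from the neutral face
    # Linear blend: each frame is the neutral face plus the weighted action offsets.
    return base_mesh + np.einsum('tk,kvc->tvc', action_params, offsets)

# Toy example: 2 facial actions over 3 frames on a 4-vertex mesh.
base = np.zeros((4, 3))
targets = np.random.rand(2, 4, 3) * 0.1
params = np.array([[0.0, 0.0],
                   [0.5, 0.2],
                   [1.0, 0.4]])
frames = animate_morph_targets(base, targets, params)
print(frames.shape)  # (3, 4, 3)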

Published in: IET Computer Vision (Volume 2, Issue 3)