This paper presents a novel data-driven method for creating varied, realistic face models by synthesizing a set of facial features according to intuitive high-level control parameters. Our method takes 3D face scans as examples in order to exploit the variations present in the real faces of individuals. We use an automatic model-fitting approach to solve the 3D registration problem. Once each example has a common surface representation, we form feature shape spaces by applying principal component analysis (PCA) to the data sets of facial feature shapes. Using PCA coefficients as a compact shape representation, we approach the shape synthesis problem by forming scattered data interpolation functions that generate the desired shape from the input anthropometric parameters. Correspondence among all example textures is obtained by parameterizing the generic 3D mesh over a 2D image domain. A new feature texture with the desired attributes is then synthesized by interpolating the example textures. Apart from an initial tuning of feature point positions and assignment of texture attribute values, our method is fully automated.
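The core pipeline described above can be sketched in a few lines: project registered example shapes into a PCA space, then fit a scattered data interpolator that maps anthropometric parameters to PCA coefficients. This is a minimal illustration, not the paper's implementation; the shapes and parameter values below are randomly generated stand-ins for real scan data, and the dimensions, the number of retained components, and the radial-basis-function interpolant are all assumptions for illustration.

```python
import numpy as np
from scipy.interpolate import RBFInterpolator

rng = np.random.default_rng(0)

# Hypothetical data: 20 registered feature shapes, each 300 vertices (x, y, z)
# flattened to 900-D vectors, with one anthropometric parameter per example
# (e.g. a normalized nose width measured on the scan).
n_examples, n_dims = 20, 900
shapes = rng.normal(size=(n_examples, n_dims))
params = rng.uniform(0.0, 1.0, size=(n_examples, 1))

# PCA via SVD on the mean-centered shapes; the leading coefficients give a
# compact shape representation.
mean_shape = shapes.mean(axis=0)
U, S, Vt = np.linalg.svd(shapes - mean_shape, full_matrices=False)
k = 5                                        # number of retained components
coeffs = (shapes - mean_shape) @ Vt[:k].T    # per-example PCA coefficients

# Scattered data interpolation: anthropometric parameters -> PCA coefficients.
interp = RBFInterpolator(params, coeffs)

# Synthesize a shape for an unseen parameter value by interpolating in
# coefficient space and reconstructing in vertex space.
new_coeffs = interp(np.array([[0.5]]))
new_shape = mean_shape + new_coeffs @ Vt[:k]
print(new_shape.shape)  # one synthesized 900-D shape vector
```

Because the interpolant passes exactly through the example coefficients, feeding in an example's own parameter value reproduces that example's shape (up to the truncated PCA components), which matches the interpolation-based synthesis the abstract describes.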