We present an algorithm for 3D face modeling from a frontal image and a profile image of a person's face. The algorithm starts by computing the 3D coordinates of automatically extracted facial feature points. The coordinates of these feature points are then used to deform a generic 3D face model into a 3D face model for that person. Procrustes analysis is used to globally minimize the distances between the facial feature vertices of the model and the corresponding 3D points obtained from the images. Local deformation is then applied to the facial feature vertices to yield a more realistic person-specific 3D model. Preliminary experiments to assess the applicability of the models for face recognition show encouraging results.
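The global alignment step described above is standard orthogonal Procrustes analysis: given corresponding point sets, find the similarity transform (scale, rotation, translation) that minimizes the sum of squared distances between them. The sketch below, assuming NumPy and using the SVD-based closed-form solution, illustrates the idea; the function name and the toy data are illustrative only and are not taken from the paper.

```python
import numpy as np

def procrustes_align(source, target):
    """Find scale s, rotation R, translation t minimizing
    sum_i || s * R @ source_i + t - target_i ||^2.

    source, target: (n, 3) arrays of corresponding 3D points,
    e.g. generic-model feature vertices and image-derived 3D points.
    """
    mu_s, mu_t = source.mean(axis=0), target.mean(axis=0)
    A, B = source - mu_s, target - mu_t          # centered point sets
    # SVD of the cross-covariance gives the optimal rotation.
    U, S, Vt = np.linalg.svd(B.T @ A)
    d = np.sign(np.linalg.det(U @ Vt))           # guard against reflections
    R = U @ np.diag([1.0, 1.0, d]) @ Vt
    scale = (S * [1.0, 1.0, d]).sum() / (A ** 2).sum()
    t = mu_t - scale * R @ mu_s
    return scale, R, t

# Toy usage: recover a known similarity transform from 10 random points.
rng = np.random.default_rng(0)
X = rng.normal(size=(10, 3))                     # stand-in "model" vertices
R_true, _ = np.linalg.qr(rng.normal(size=(3, 3)))
if np.linalg.det(R_true) < 0:                    # force a proper rotation
    R_true[:, 0] *= -1
t_true = np.array([1.0, -2.0, 0.5])
Y = 2.0 * X @ R_true.T + t_true                  # scaled, rotated, shifted copy
s, R, t = procrustes_align(X, Y)
assert np.allclose(s * X @ R.T + t, Y, atol=1e-8)
```

In the paper's pipeline this transform would bring the generic model's feature vertices into global correspondence with the measured 3D feature points, after which the per-vertex local deformation refines the residual shape differences.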