This work presents a new approach for the analysis of 3D human facial expressions. Our methodology is based on 2D and 3D wavelet transforms, which are used to estimate multi-scale features from a real face acquired by a 3D scanner. The proposed methodology starts from a dataset composed of faces displaying seven different facial expressions. An automatic pre-processing step, based on an ellipsoidal cropping, is applied to the dataset. Thereafter, 2D and 3D descriptors are extracted from different scales of the wavelet transforms to obtain the facial expression features. The multi-scale features are represented in a multi-variate feature space, which is analysed by the Sequential Forward Floating Selection (SFFS) algorithm using an entropy criterion function to select the subset of features that best represents each facial expression model. The obtained results corroborate the potential of multi-scale feature extraction for the analysis of 3D facial expressions.
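The feature-selection stage described above can be sketched generically. The code below is an illustrative implementation of Sequential Forward Floating Selection against an arbitrary criterion function `J`; the function name, the parameters, and the toy criterion in the usage example are assumptions for demonstration, not the paper's actual entropy criterion over wavelet features.

```python
import numpy as np

def sffs(J, n_features, k_target):
    """Sequential Forward Floating Selection (illustrative sketch).

    J          -- criterion function mapping a tuple of feature indices to a
                  score (higher is better); the paper uses an entropy-based
                  criterion, which is abstracted away here.
    n_features -- total number of candidate features.
    k_target   -- desired subset size.
    """
    selected = []
    # best[k] holds the best subset of size k seen so far and its score
    best = {0: ((), float("-inf"))}
    while len(selected) < k_target:
        # Forward step: add the single feature that maximizes J.
        candidates = [f for f in range(n_features) if f not in selected]
        f_best = max(candidates, key=lambda f: J(tuple(selected + [f])))
        selected.append(f_best)
        k = len(selected)
        score = J(tuple(selected))
        if k not in best or score > best[k][1]:
            best[k] = (tuple(selected), score)
        # Floating (conditional backward) step: drop a feature whenever the
        # reduced subset beats the best subset of that smaller size.
        while len(selected) > 2:
            f_worst = max(selected,
                          key=lambda f: J(tuple(x for x in selected if x != f)))
            reduced = [x for x in selected if x != f_worst]
            r_score = J(tuple(reduced))
            if r_score > best[len(reduced)][1]:
                selected = reduced
                best[len(reduced)] = (tuple(reduced), r_score)
            else:
                break
    return best[k_target][0]
```

As a usage example with a hypothetical criterion that rewards a known informative set `{0, 3, 5}` and lightly penalizes subset size, `sffs(lambda S: len(set(S) & {0, 3, 5}) - 0.01 * len(S), 8, 3)` recovers exactly those three features.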