Both the configuration of facial features and the timing of facial actions are important to emotion and communication. Previous literature has focused primarily on the former. We developed an automatic facial expression analysis system that quantifies the timing of facial actions as well as head and eye motion during spontaneous facial expression. To assess coherence among these modalities, we recorded and analyzed spontaneous smiles in 62 young women of varied ethnicity, aged 18 to 35 years. Smiles occurred following directed facial action tasks, a situation likely to elicit smiles of embarrassment. Smiles (AU 12) were manually coded by certified FACS coders. Three-dimensional head motion was recovered using a cylindrical head model. Motion vectors for lip-corner displacement were measured using feature-point tracking. Eye closure and horizontal and vertical eye motion (from which direction of gaze, or visual regard, can be inferred) were measured by a generative model-fitting approach. The mean within-subject correlation between lip-corner displacement, head motion, and eye motion ranged in magnitude from 0.36 to 0.50, which suggests moderate coherence among these modalities. Lip-corner displacement and head pitch were negatively correlated, as predicted for smiles of embarrassment. These findings are consistent with recent research in psychology suggesting that facial actions are embedded within coordinated motor structures. They also suggest that the direction of correlation among features may discriminate between facial actions with similar morphology but different communicative meaning, inform automatic facial expression recognition, and provide normative data for animating computer avatars.
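The within-subject correlation analysis described above can be sketched as follows. This is a minimal illustration, not the paper's actual pipeline: the feature-extraction steps (head-model tracking, feature-point tracking) are not reproduced, and the per-subject time series here are synthetic toy signals with a built-in negative coupling, standing in for lip-corner displacement and head pitch.

```python
import numpy as np

def within_subject_correlations(lip_corner, head_pitch):
    """Pearson correlation between two time series, computed per subject.

    lip_corner, head_pitch: lists of 1-D arrays, one array per subject.
    Returns an array of one correlation coefficient per subject, which
    can then be averaged to summarize coherence across the sample.
    """
    rs = []
    for lip, pitch in zip(lip_corner, head_pitch):
        # np.corrcoef returns the 2x2 correlation matrix; take the
        # off-diagonal element as the Pearson r between the two series.
        rs.append(np.corrcoef(lip, pitch)[0, 1])
    return np.array(rs)

# Toy data: 3 "subjects" whose signals are negatively coupled, as in
# embarrassed smiles (lip corners rise while the head pitches forward).
rng = np.random.default_rng(0)
t = np.linspace(0.0, 1.0, 100)
lips, pitches = [], []
for _ in range(3):
    lips.append(np.sin(2 * np.pi * t) + 0.1 * rng.standard_normal(t.size))
    pitches.append(-np.sin(2 * np.pi * t) + 0.1 * rng.standard_normal(t.size))

rs = within_subject_correlations(lips, pitches)
print(rs.mean())  # strongly negative for these constructed signals
```

Averaging the per-subject coefficients, rather than pooling all frames into one correlation, keeps the statistic within subjects, matching the analysis the abstract reports.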