Analyses of a Multimodal Spontaneous Facial Expression Database

7 Author(s): Shangfei Wang, Zhilei Liu, Zhaoyu Wang, Guobing Wu, et al.
School of Computer Science and Technology, University of Science and Technology of China, Hefei, China

Creating a large and natural facial expression database is a prerequisite for facial expression analysis and classification. It is, however, not only time-consuming but also difficult to capture an adequately large number of spontaneous facial expression images and their meanings, because no standard, uniform, and exact measurements are available for database collection and annotation. Thus, comprehensive first-hand data analyses of a spontaneous expression database may provide insight for future research on database construction, expression recognition, and emotion inference. This paper presents our analyses of a multimodal spontaneous facial expression database of natural visible and infrared facial expressions (NVIE). First, the effectiveness of emotion-eliciting videos in the database collection is analyzed with the mean and variance of the subjects' self-reported data. Second, an interrater reliability analysis of raters' subjective evaluations for apex expression images and sequences is conducted using Kappa and Kendall's coefficients. Third, we propose a matching rate matrix to explore the agreement between displayed spontaneous expressions and felt affective states. Lastly, the thermal differences between posed and spontaneous facial expressions are analyzed using a paired-samples t-test. The results of these analyses demonstrate the effectiveness of our emotion-inducing experimental design, the gender difference in emotional responses, and the coexistence of multiple emotions/expressions. Facial image sequences are more informative than apex images for both expression and emotion recognition. Labeling an expression image or sequence with multiple categories and their intensities could be a better approach than labeling it with one dominant category. The results also demonstrate both the importance of facial expressions as a means of communication to convey affective states and the diversity of the displayed manifestations of felt emotions. There are indeed significant differences between the temperature difference data of most posed and spontaneous facial expressions, many of which are found in the forehead and cheek regions.
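Two of the analyses described in the abstract — the interrater reliability check and the paired-samples t-test on temperature differences — can be sketched in plain Python. This is a minimal illustration, not code from the paper: the helper names `cohens_kappa` and `paired_t_statistic` are hypothetical, and the rating labels and temperature values are made-up examples, not NVIE data.

```python
from collections import Counter
import math

def cohens_kappa(rater_a, rater_b):
    """Agreement between two raters on the same items, corrected for chance."""
    n = len(rater_a)
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    counts_a, counts_b = Counter(rater_a), Counter(rater_b)
    # Chance agreement: probability both raters pick the same label independently.
    expected = sum(counts_a[c] * counts_b.get(c, 0) for c in counts_a) / (n * n)
    return (observed - expected) / (1 - expected)

def paired_t_statistic(x, y):
    """t statistic for a paired-samples t-test on matched observations."""
    diffs = [a - b for a, b in zip(x, y)]
    n = len(diffs)
    mean = sum(diffs) / n
    # Sample variance of the differences (n - 1 in the denominator).
    var = sum((d - mean) ** 2 for d in diffs) / (n - 1)
    return mean / math.sqrt(var / n)

# Two raters labeling the same four apex expression images (illustrative labels).
kappa = cohens_kappa(["happy", "sad", "happy", "fear"],
                     ["happy", "sad", "sad", "fear"])

# Forehead temperatures for the same subjects, posed vs. spontaneous (illustrative values).
t_stat = paired_t_statistic([36.5, 36.8, 37.0, 36.6],
                            [36.2, 36.4, 36.9, 36.3])
```

A kappa near 1 indicates raters agree far beyond chance on expression labels; a large positive t statistic on the paired temperature differences suggests posed and spontaneous expressions differ systematically in a facial region. The paper additionally uses Kendall's coefficient for ordinal ratings, which follows the same pattern with a rank-correlation statistic.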

Published in:

IEEE Transactions on Affective Computing (Volume 4, Issue 1)

Date of Publication:

Jan.-March 2013
