Classification of affects using head movement, skin color features and physiological signals

3 Author(s)
Monkaresi, H.; Hussain, M.S.; Calvo, R.A. (School of Electrical & Information Engineering, University of Sydney, Sydney, NSW, Australia)

The automated detection of emotions opens up new applications in areas such as education, mental health, and entertainment, and there is growing interest in detection techniques that combine multiple modalities. In this study, we introduce automated techniques to detect users' affective states from a fusion model of facial videos and physiological measures. Subjects' (N=20) natural facial behavior and physiological responses were recorded while they viewed images from the International Affective Picture System (IAPS). This paper provides a direct comparison between user-dependent, gender-specific, and combined-subject models for affect classification. The analysis indicates that the accuracy of the fusion model (head movement, facial color, and physiology) was statistically higher than that of the best individual modality for spontaneous affect expressions.
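The abstract does not specify the fusion strategy or classifiers used; as a rough illustration of how modality fusion for affect classification can work in principle, the sketch below shows decision-level (late) fusion: a separate toy classifier per modality, combined by majority vote. All data, feature dimensions, and the nearest-centroid classifier are invented for illustration and are not taken from the paper.

```python
# Hypothetical sketch of decision-level fusion for affect classification.
# The toy features stand in for modality-specific measurements (e.g. video-
# derived vs. physiological features); none of this reproduces the paper.

def nearest_centroid_fit(X, y):
    """Return a per-class mean feature vector (a minimal classifier)."""
    centroids = {}
    for label in set(y):
        rows = [x for x, lbl in zip(X, y) if lbl == label]
        centroids[label] = [sum(col) / len(rows) for col in zip(*rows)]
    return centroids

def nearest_centroid_predict(centroids, x):
    """Predict the class whose centroid is closest in Euclidean distance."""
    def dist(a, b):
        return sum((u - v) ** 2 for u, v in zip(a, b)) ** 0.5
    return min(centroids, key=lambda lbl: dist(centroids[lbl], x))

def fuse_predict(models, samples):
    """Late fusion: majority vote over modality-specific predictions."""
    votes = [nearest_centroid_predict(m, s) for m, s in zip(models, samples)]
    return max(set(votes), key=votes.count)

# Toy training data: two modalities, two affect classes ("high"/"low").
video_X  = [[0.9, 0.8], [0.8, 0.9], [0.1, 0.2], [0.2, 0.1]]
physio_X = [[0.7, 0.6], [0.9, 0.7], [0.3, 0.2], [0.1, 0.3]]
labels   = ["high", "high", "low", "low"]

video_model  = nearest_centroid_fit(video_X, labels)
physio_model = nearest_centroid_fit(physio_X, labels)

# Classify one new observation given its per-modality feature vectors.
pred = fuse_predict([video_model, physio_model], [[0.85, 0.75], [0.8, 0.65]])
print(pred)  # both modality classifiers agree on "high"
```

Feature-level fusion (concatenating the modality features into one vector before training a single classifier) is the other common design; the paper's reported fusion model combines head movement, facial color, and physiology, but its exact combination scheme is given in the full text, not the abstract.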

Published in:

2012 IEEE International Conference on Systems, Man, and Cybernetics (SMC)

Date of Conference:

14-17 Oct. 2012