The automated detection of emotions opens up new applications in areas such as education, mental health, and entertainment. There is increasing interest in detection techniques that combine multiple modalities. In this study, we introduce automated techniques to detect users' affective states using a fusion model of facial videos and physiological measures. Subjects' (N=20) natural facial behavior and physiological responses were recorded while they viewed images from the International Affective Picture System (IAPS). This paper provides a direct comparison between user-dependent, gender-specific, and combined-subject models for affect classification. The analysis indicates that, for spontaneous affect expressions, the accuracy of the fusion model (head movement, facial color, and physiology) was statistically higher than that of the best individual modality.
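As a rough illustration of the kind of multimodal fusion described above, the sketch below shows feature-level fusion: per-modality feature vectors (head movement, facial color, physiology) are concatenated into a single vector before classification. This is a minimal, hypothetical sketch; the feature values, the nearest-centroid classifier, and the two-class (positive/negative) setup are assumptions for illustration and are not taken from the study.

```python
# Hypothetical sketch of feature-level fusion for affect classification.
# All feature values below are illustrative, not data from the study.

def fuse(head_movement, facial_color, physiology):
    """Concatenate per-modality feature vectors (feature-level fusion)."""
    return head_movement + facial_color + physiology

def nearest_centroid_fit(samples, labels):
    """Compute the mean feature vector (centroid) for each class label."""
    sums, counts = {}, {}
    for x, y in zip(samples, labels):
        acc = sums.setdefault(y, [0.0] * len(x))
        for i, v in enumerate(x):
            acc[i] += v
        counts[y] = counts.get(y, 0) + 1
    return {y: [v / counts[y] for v in acc] for y, acc in sums.items()}

def nearest_centroid_predict(centroids, x):
    """Assign the class whose centroid is closest in Euclidean distance."""
    def dist2(a, b):
        return sum((u - v) ** 2 for u, v in zip(a, b))
    return min(centroids, key=lambda y: dist2(centroids[y], x))

# Toy training set: two fused samples per (assumed) affect class.
train = [
    (fuse([0.1, 0.2], [0.8], [60.0, 0.3]), "positive"),
    (fuse([0.2, 0.1], [0.7], [62.0, 0.4]), "positive"),
    (fuse([0.9, 0.8], [0.2], [85.0, 0.9]), "negative"),
    (fuse([0.8, 0.9], [0.3], [88.0, 0.8]), "negative"),
]
centroids = nearest_centroid_fit([x for x, _ in train], [y for _, y in train])
print(nearest_centroid_predict(centroids, fuse([0.15, 0.15], [0.75], [61.0, 0.35])))
# → positive
```

An alternative design is decision-level fusion, where each modality is classified separately and the per-modality decisions are then combined (e.g., by weighted voting); feature-level concatenation is simply the easiest variant to sketch.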