This paper presents a system for inferring complex mental states from video of facial expressions and head gestures in real time. The system is based on a multi-level dynamic Bayesian network classifier that models complex mental states as a number of interacting facial and head displays, identified from component-based facial features. Experimental results are reported for six mental state groups: agreement, concentrating, disagreement, interested, thinking, and unsure. Real-time performance, unobtrusiveness, and lack of preprocessing make our system particularly suitable for user-independent human-computer interaction.
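The abstract describes a dynamic Bayesian network in which a hidden mental-state variable is inferred from observed facial and head displays over time. As a rough illustration of that idea (not the paper's actual model or parameters), the sketch below runs one step of forward filtering: the belief over the six mental-state classes is propagated through an assumed transition model and reweighted by an assumed likelihood for the currently observed displays. The transition matrix and display likelihoods here are invented for the example.

```python
# Illustrative sketch only: single-level forward filtering over a hidden
# mental-state variable, standing in for the paper's multi-level DBN.
# All probabilities below are made up for demonstration.

STATES = ["agreement", "concentrating", "disagreement",
          "interested", "thinking", "unsure"]

def normalize(belief):
    """Rescale a belief vector so its entries sum to 1."""
    total = sum(belief)
    return [b / total for b in belief]

def forward_step(belief, transition, obs_likelihood):
    """One filtering step: predict with the transition model, then
    weight each state by the likelihood of the observed displays."""
    n = len(belief)
    predicted = [sum(belief[i] * transition[i][j] for i in range(n))
                 for j in range(n)]
    return normalize([p * l for p, l in zip(predicted, obs_likelihood)])

# Example: uniform prior, sticky transitions, and a (hypothetical)
# likelihood vector for observing a head nod, which favors "agreement".
n = len(STATES)
prior = [1.0 / n] * n
transition = [[0.9 if i == j else 0.02 for j in range(n)] for i in range(n)]
nod_likelihood = [0.8, 0.1, 0.05, 0.3, 0.1, 0.2]
posterior = forward_step(prior, transition, nod_likelihood)
```

After this step, `posterior` concentrates on "agreement", mirroring how evidence from head and facial displays accumulates into a mental-state inference in the paper's system.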
Date of Conference: 27 June - 2 July 2004