
Using body movement and posture for emotion detection in non-acted scenarios

Authors: Garber-Barron, M., and Mei Si (Cognitive Science Dept., Rensselaer Polytechnic Institute, Troy, NY, USA)

In this paper, we explored the use of features that represent body posture and movement for automatically detecting people's emotions in non-acted standing scenarios. We focused on four emotions that are often observed when people play video games: triumph, frustration, defeat, and concentration. The dataset consists of recordings of the rotation angles of the player's joints while playing Wii sports games. We applied various machine learning techniques and bagged them for prediction. When body pose and movement features were combined, we reached an overall accuracy of 66.5% for differentiating between these four emotions. In contrast, when using the raw joint rotations, limb rotation movement, or posture features alone, we were only able to achieve accuracy rates of 59%, 61%, and 62%, respectively. Our results suggest that features representing changes in body posture can yield improved classification rates over using static postures or joint information alone.
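The pipeline described in the abstract (posture and movement features fed into bagged classifiers) could be sketched as follows. This is a minimal illustration, not the authors' implementation: the feature layout, dataset, and choice of scikit-learn's `BaggingClassifier` (whose default base learner is a decision tree) are all assumptions for demonstration purposes.

```python
# Hypothetical sketch of bagged classification of four emotions
# (triumph, frustration, defeat, concentration) from body features.
# The data below is synthetic; in the paper, each recording would be
# summarised by posture features and per-joint rotation changes.
import numpy as np
from sklearn.ensemble import BaggingClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)

# Toy stand-in for the real dataset: rows are recordings, columns are
# illustrative posture / limb-rotation-movement features.
n_samples, n_features = 200, 12
X = rng.normal(size=(n_samples, n_features))
y = rng.integers(0, 4, size=n_samples)  # labels 0..3 -> the four emotions

# Bag several base classifiers and estimate accuracy by cross-validation,
# mirroring the "applied various techniques and bagged them" step.
clf = BaggingClassifier(n_estimators=25, random_state=0)
scores = cross_val_score(clf, X, y, cv=5)
print(f"mean CV accuracy: {scores.mean():.3f}")
```

On real labeled data, comparing this score across feature subsets (raw joint rotations vs. movement vs. posture vs. all combined) would reproduce the kind of ablation reported in the abstract.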

Published in:

2012 IEEE International Conference on Fuzzy Systems (FUZZ-IEEE)

Date of Conference:

10-15 June 2012