Measuring posture features saliency in expressing affective states

2 Author(s):
P. R. De Silva (Database Systems Lab, University of Aizu, Japan); N. Bianchi-Berthouze

Creating systems capable of interacting naturally and effectively with humans on many levels is increasingly essential. One step toward this goal is recognizing emotion from the whole-body postures of human partners, an area that has so far received little attention in computer science. Our aim is therefore to identify the posture features that play a role in affective expression and to measure their saliency. As a case study, we collected affective gestures from human subjects using a motion-capture system and described these gestures with spatial features. Using standard statistical techniques, we verified a statistically significant correlation between the emotion intended by the acting subjects and the emotion perceived by observers. We then applied discriminant analysis to measure the saliency of the proposed set of posture features in discriminating between four basic emotions: angry, fear, happy, and sad. Our results show that the feature set discriminates well between emotions and also provides evidence of a strong overlap between the descriptors used in acting and in observing.
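The saliency-measurement step described above can be sketched with multi-class linear discriminant analysis. The sketch below is a minimal illustration, not the authors' implementation: the feature values are synthetic stand-ins for the paper's motion-capture posture descriptors, and the use of scikit-learn and the specific "coefficient magnitude as saliency" heuristic are assumptions for illustration.

```python
# Hypothetical sketch: how well do spatial posture features discriminate
# between four emotion categories, and which features matter most?
# Feature values are synthetic; the real study used motion-capture data.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.default_rng(0)
emotions = ["angry", "fear", "happy", "sad"]

# Synthetic "posture feature" vectors (e.g. hand height, arm openness,
# torso inclination): one well-separated cluster per emotion, 30 samples
# each, 6 features per sample.
centers = rng.normal(scale=5.0, size=(4, 6))
X = np.vstack([c + rng.normal(scale=0.5, size=(30, 6)) for c in centers])
y = np.repeat(emotions, 30)

lda = LinearDiscriminantAnalysis()
lda.fit(X, y)

# Resubstitution accuracy: how separable the four classes are under the
# fitted linear discriminants.
accuracy = lda.score(X, y)

# The absolute size of each feature's discriminant coefficients gives a
# rough saliency ranking (larger weight = more influence on separation).
saliency = np.abs(lda.coef_).mean(axis=0)
ranking = np.argsort(saliency)[::-1]
```

With clusters this well separated, the discriminant functions recover the emotion labels almost perfectly; on real posture data, the same accuracy and coefficient inspection would quantify how salient each feature is for affective expression.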

Published in:

Proceedings of the 2004 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS 2004), Volume 2

Date of Conference:

28 September – 2 October 2004