Intraindividual and interindividual multimodal emotion analyses in Human-Machine-Interaction

7 Author(s)

Ronald Böck, Cognitive Systems, Otto-von-Guericke University Magdeburg, P.O. 4120, 39016 Magdeburg, Germany; Stefan Glüge; Andreas Wendemuth; Kerstin Limbrecht; et al.

Interactive computer systems today interact nearly effortlessly with humans through menu-driven, mouse- and text-based input. With other modalities such as audio and gesture control, however, systems still lack flexibility. To respond appropriately, these intelligent systems require specific cues about the user's internal state. Reliable emotion recognition by technical systems is therefore an important issue in computer science and its applications. To develop an appropriate methodology for emotion analyses, a multimodal study is introduced here. Audio and video signals as well as biopsychological signals of the user are applied to detect intraindividual behavioural prototypes that can be used to predict the user's emotional states. Additionally, interindividual differences are considered and discussed. Statistical analyses showed results, in most cases, with statistical significance at p < 0.05 and an effect size d > 1.05.
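The effect size d reported above is conventionally Cohen's d, the difference between two group means divided by their pooled standard deviation. A minimal sketch of that computation follows; the sample data are hypothetical and only illustrate the formula, not the study's measurements.

```python
import math

def cohens_d(a, b):
    """Cohen's d effect size between two samples, using the pooled SD."""
    na, nb = len(a), len(b)
    mean_a = sum(a) / na
    mean_b = sum(b) / nb
    # Unbiased sample variances (divide by n - 1)
    var_a = sum((x - mean_a) ** 2 for x in a) / (na - 1)
    var_b = sum((x - mean_b) ** 2 for x in b) / (nb - 1)
    # Pooled standard deviation across both groups
    pooled_sd = math.sqrt(((na - 1) * var_a + (nb - 1) * var_b) / (na + nb - 2))
    return (mean_a - mean_b) / pooled_sd

# Hypothetical feature values for two behavioural prototypes
print(round(cohens_d([2.0, 2.5, 3.0, 3.5], [1.0, 1.2, 1.4, 1.6]), 2))  # 2.95
```

By the usual rule of thumb, d values above 0.8 indicate a large effect, so d > 1.05 marks a clearly separable difference between the compared conditions.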

Published in:

2012 IEEE International Multi-Disciplinary Conference on Cognitive Methods in Situation Awareness and Decision Support

Date of Conference:

6-8 March 2012