
Emotional boundaries for choosing modalities according to the intensity of emotion in a linear affect-expression space

Authors (5): Jeong Woo Park (School of Electrical Engineering and Computer Science, Division of Electrical Engineering, KAIST, Daejeon, Korea); Hui Sung Lee; Su Hun Jo; Min-gyu Kim; et al.

Recently, multimodal expression has become an important issue in the field of HRI. Synchronizing modalities and determining which modalities to use are important aspects of multimodal expression. For example, when robots express emotional states, they may use facial expressions alone, or facial expressions combined with gestures, neck motions, sounds, etc. In this paper, emotional boundaries are proposed for multimodal expression in a three-dimensional affect space. The simultaneous expression of facial expressions and gestures was demonstrated on a simulator using the proposed emotional boundaries.
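
The abstract does not spell out the boundary rule itself, but the underlying idea, thresholding the intensity of an emotion in an affect space to decide which modality set a robot should use, can be illustrated with a minimal sketch. All names, dimensions, boundary values, and modality sets below are illustrative assumptions, not the authors' actual method:

    import math

    # Hypothetical modality sets, ordered by expressive intensity.
    # Boundary values are illustrative, not taken from the paper.
    MODALITY_SETS = [
        (0.3, ["facial_expression"]),                                    # weak emotion
        (0.6, ["facial_expression", "neck_motion"]),                     # moderate
        (1.0, ["facial_expression", "neck_motion", "gesture", "sound"]), # strong
    ]

    def select_modalities(affect_point):
        """Choose a modality set from the intensity of an emotion,
        measured here as the Euclidean norm of a point in a 3-D affect
        space assumed to be normalized to the unit range."""
        intensity = math.sqrt(sum(c * c for c in affect_point))
        intensity = min(intensity, 1.0)  # clamp to the unit range
        for boundary, modalities in MODALITY_SETS:
            if intensity <= boundary:
                return modalities
        return MODALITY_SETS[-1][1]

    # Example: a moderately intense affect point (norm ~ 0.51)
    # falls inside the second boundary.
    print(select_modalities((0.3, 0.4, 0.1)))
    # -> ['facial_expression', 'neck_motion']

In this sketch, each boundary is a sphere around the neutral origin of the affect space; crossing a boundary adds modalities, so stronger emotions recruit richer multimodal expression.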

Published in: RO-MAN 2008 - The 17th IEEE International Symposium on Robot and Human Interactive Communication

Date of Conference: 1-3 Aug. 2008