
Facial expressive robotic head system for human-robot communication and its application in home environment

5 Author(s)
Fukuda, T. (Dept. of Micro Syst. Eng., Nagoya Univ., Japan); Myung-Jin Jung; Nakashima, M.; Arai, F.

This paper describes a robotic-head system as a multimodal communication device for human-robot interaction, and the system's potential application in home environments. Most robotic systems designed for natural user interaction have facial expressions, since facial expressiveness, along with prosodic expressiveness, is regarded as a key component in developing personal attachment. The first part of the paper describes our robotic head system, the Character Robot Face (CRF). A deformation approach and a parametric normalization scheme are proposed to produce facial expressions on nonhuman face models with high recognition rates. In the second half of the paper, the CRF is endowed with artificial emotions and assigned tasks conceivable in home environments. A coordination mechanism between the robot's mood (an activated emotion) and its task is proposed so that, in the absence of an explicit task command from the user, the robot can select a task suited to its current mood by consulting the emotion-task history. When the robot performs a task, the associated emotion value is boosted according to the same history, making that emotion more likely to be activated.
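The mood-task coordination mechanism summarized in the abstract can be illustrated with a minimal sketch. All names, emotion labels, tasks, and numeric values below are assumptions for illustration only; the paper's actual mechanism and parameters may differ.

```python
# Hedged sketch of a mood-task coordination loop: the robot's strongest
# emotion is its "mood"; with no explicit command, the task most associated
# with that mood (via the emotion-task history) is selected, and performing
# a task boosts the associated emotion. Labels and weights are illustrative.

EMOTIONS = ["joy", "sadness", "anger", "surprise"]
TASKS = ["greet_user", "fetch_news", "play_music"]

class MoodTaskCoordinator:
    def __init__(self, boost=0.2):
        # emotion intensities; the strongest one is the current mood
        self.emotions = {e: 0.25 for e in EMOTIONS}
        # emotion-task history: association weight per (emotion, task) pair
        self.history = {e: {t: 1.0 for t in TASKS} for e in EMOTIONS}
        self.boost = boost

    def mood(self):
        # the activated (strongest) emotion
        return max(self.emotions, key=self.emotions.get)

    def select_task(self, command=None):
        # an explicit user command always takes priority
        if command is not None:
            return command
        # otherwise pick the task most associated with the current mood
        weights = self.history[self.mood()]
        return max(weights, key=weights.get)

    def perform(self, task):
        # performing a task strengthens the mood-task association and
        # boosts the emotion, making it more likely to be activated later
        mood = self.mood()
        self.history[mood][task] += 1.0
        self.emotions[mood] = min(1.0, self.emotions[mood] + self.boost)
        return task
```

A typical loop would set emotion intensities from sensing, call `select_task()` each cycle, and feed the chosen task back through `perform()` so that moods and tasks reinforce one another over time.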

Published in:

Proceedings of the IEEE (Volume: 92, Issue: 11)