Empathic Touch by Relational Agents

Authors: Timothy W. Bickmore (Northeastern University, Boston), Rukmal Fernando, Lazlo Ring, and Daniel Schulman

We describe a series of experiments with an agent designed to model human conversational touch: it can physically touch users in synchrony with speech and other nonverbal communicative behavior, and it uses touch to express empathy to users in distress. The agent is composed of an animated human face displayed on a monitor affixed to the top of a human mannequin, with touch conveyed by an air bladder that squeezes a user's hand. We demonstrate that when touch is used alone, hand squeeze pressure and the number of squeezes are associated with user perceptions of the affect arousal conveyed by the agent, while the number of squeezes and squeeze duration are associated with affect valence. We also show that when affect-relevant cues are presented simultaneously in the agent's facial display, speech prosody, and touch, facial display dominates user perceptions of affect valence, facial display and prosody are associated with affect arousal, and touch has little effect. Finally, we show that when touch is used in the context of an empathic, comforting interaction (without manipulating affect cues in other modalities), it can lead to better perceptions of the relationship with the agent, but only for users who are comfortable being touched by other people.
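For concreteness, the reported touch-to-affect associations can be summarized in a small sketch. The abstract gives only the direction of each association, so the function below, its parameter ranges, and its weights are purely illustrative assumptions, not values or a model from the paper.

    # Hypothetical sketch of the associations reported in the abstract:
    # squeeze pressure and number of squeezes relate to perceived arousal;
    # number of squeezes and squeeze duration relate to perceived valence.
    # All weights and normalization ranges below are made-up placeholders.

    def perceived_affect(pressure: float, num_squeezes: int, duration_s: float):
        """Return a rough (arousal, valence) estimate, each in [-1, 1]."""
        # Higher pressure and more squeezes -> higher perceived arousal.
        arousal = 0.6 * min(pressure / 20.0, 1.0) + 0.4 * min(num_squeezes / 5.0, 1.0)
        # More and longer squeezes -> perceived valence (direction assumed).
        valence = 0.5 * min(num_squeezes / 5.0, 1.0) + 0.5 * min(duration_s / 3.0, 1.0)
        # Rescale from [0, 1] to [-1, 1].
        return 2 * arousal - 1, 2 * valence - 1

    print(perceived_affect(pressure=15.0, num_squeezes=3, duration_s=2.0))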

Published in:

IEEE Transactions on Affective Computing (Volume: 1, Issue: 1)