IEEE Transactions on Affective Computing

Issue 1 • January 2010

Displaying Results 1-7 of 7
  • Editorial

    Publication Year: 2010, Page(s): 1-10
    PDF (421 KB)
    Freely Available from IEEE
  • Affective Computing: From Laughter to IEEE

    Publication Year: 2010, Page(s): 11-17
    Cited by: Papers (30)
    PDF (153 KB)
    Freely Available from IEEE
  • Affect Detection: An Interdisciplinary Review of Models, Methods, and Their Applications

    Publication Year: 2010, Page(s): 18-37
    Cited by: Papers (70)
    PDF (2600 KB)

    This survey describes recent progress in the field of Affective Computing (AC), with a focus on affect detection. Although many AC researchers have traditionally attempted to remain agnostic to the different emotion theories proposed by psychologists, the affective technologies being developed are rife with theoretical assumptions that impact their effectiveness. Hence, an informed and integrated examination of emotion theories from multiple areas will need to become part of computing practice if truly effective real-world systems are to be achieved. This survey discusses theoretical perspectives that view emotions as expressions, embodiments, outcomes of cognitive appraisal, social constructs, products of neural circuitry, and psychological interpretations of basic feelings. It provides meta-analyses on existing reviews of affect detection systems that focus on traditional affect detection modalities like physiology, face, and voice, and also reviews emerging research on more novel channels such as text, body language, and complex multimodal systems. This survey explicitly explores the multidisciplinary foundation that underlies all AC applications by describing how AC researchers have incorporated psychological theories of emotion and how these theories affect research questions, methods, results, and their interpretations. In this way, models and methods can be compared, and emerging insights from various disciplines can be more expertly integrated.

  • Smile When You Read This, Whether You Like It or Not: Conceptual Challenges to Affect Detection

    Publication Year: 2010, Page(s): 38-41
    Cited by: Papers (1)
    PDF (112 KB)

    The survey by Calvo and D'Mello presents a useful overview of progress and issues in affect detection. They focus on emotion theories that are relevant to Affective Computing (AC) and suggest stronger collaborations between disciplines. My contribution emphasizes the importance of these issues for AC. In fact, empirical research strongly suggests that facial, vocal, and bodily expressions, subjective experience, and physiological changes are often not very highly correlated in spontaneous situations. Overestimating this cohesion limits the usefulness of affect detection methods in real-world applications. Other factors, such as social context, knowledge of the goals of particular interactions, and interindividual differences, are critically important for improving affect detection. At times, social concepts, such as politeness, might be better suited to modeling realistic behavior. Knowledge of affect perception is important for estimating the level of realism required to create satisfying and productive interactions between users and artificial systems. Interdisciplinary joint research between social and biological scientists on the one hand and computer scientists and engineers on the other is necessary to deal with the complexity of affective processes. All disciplines involved have much to gain in the process.

  • Broadening the Scope of Affect Detection Research

    Publication Year: 2010, Page(s): 42-45
    Cited by: Papers (3)
    PDF (131 KB)

    I propose to broaden the scope of affect detection research in three related directions. First, the task definition should be broadened from the detection of affects to the inference of mental states. Second, the detection process should be reconceptualized as an inference to the best explanation that makes essential use of a theory of mind. Third, additional data sources should be utilized to infer emotions: information about the situation (eliciting events), information about emotion-related goal-directed actions, and self-reports about emotions.

  • CAO: A Fully Automatic Emoticon Analysis System Based on Theory of Kinesics

    Publication Year: 2010, Page(s): 46-59
    Cited by: Papers (5)
    PDF (4508 KB)

    This paper presents CAO, a system for affect analysis of emoticons in Japanese online communication. Emoticons are strings of symbols widely used in text-based online communication to convey user emotions. The system extracts emoticons from input and determines the specific emotion types they express with a three-step procedure. First, it matches the extracted emoticons against a predetermined raw emoticon database. The database contains over 10,000 emoticon samples extracted from the Web and annotated automatically. Emoticons whose emotion types cannot be determined using this database alone are automatically divided into semantic areas representing “mouths” or “eyes,” based on the idea of kinemes from the theory of kinesics. The areas are automatically annotated according to their co-occurrence in the database. The annotation is first based on the eye-mouth-eye triplet; if no such triplet is found, all semantic areas are estimated separately. This provides hints about potential groups of expressed emotions, giving the system coverage exceeding 3 million possibilities. The evaluation, performed on both training and test sets, confirmed the system's capability to detect and extract emoticons, analyze their semantic structure, and estimate the emotion types they potentially express. The system achieved nearly ideal scores, outperforming existing emoticon analysis systems. (A minimal illustrative sketch of this three-step flow appears after the article listing.)

  • Empathic Touch by Relational Agents

    Publication Year: 2010, Page(s): 60-71
    Cited by: Papers (3)
    PDF (1975 KB)

    We describe a series of experiments with an agent designed to model human conversational touch, one capable of physically touching users in synchrony with speech and other nonverbal communicative behavior, and its use in expressing empathy to users in distress. The agent is composed of an animated human face displayed on a monitor affixed to the top of a human mannequin, with touch conveyed by an air bladder that squeezes a user's hand. We demonstrate that when touch is used alone, hand squeeze pressure and number of squeezes are associated with user perceptions of the affect arousal conveyed by the agent, while number of squeezes and squeeze duration are associated with affect valence. We also show that when the agent simultaneously presents affect-relevant cues in facial display, speech prosody, and touch, facial display dominates user perceptions of affect valence, and facial display and prosody are associated with affect arousal, while touch has little effect. Finally, we show that when touch is used in the context of an empathic, comforting interaction (but without the manipulation of affect cues in other modalities), it can lead to better perceptions of the relationship with the agent, but only for users who are comfortable being touched by other people.

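The three-step procedure described in the CAO abstract above can be illustrated with a short sketch. The following Python fragment is a hypothetical toy illustration, not the published CAO implementation: the emoticon database, kineme annotation tables, and regular expression are hand-made stand-ins for the paper's 10,000-entry, automatically annotated database. The control flow, however, follows the abstract: match the whole emoticon against the raw database first, then fall back to the annotated eye-mouth-eye triplet, and finally estimate emotions from the separate eye and mouth areas.

import re
from collections import Counter

# Hypothetical toy database; the real CAO system uses over 10,000
# automatically annotated emoticons extracted from the Web.
RAW_EMOTICON_DB = {
    "(^_^)": "joy",
    "(T_T)": "sadness",
    "(>_<)": "distress",
    "(o_o)": "surprise",
}

# Hand-made kineme-level annotations standing in for annotations derived
# from co-occurrence statistics over the database.
EYE_EMOTIONS = {"^": ["joy"], "T": ["sadness"], ">": ["distress"],
                "<": ["distress"], "o": ["surprise"]}
MOUTH_EMOTIONS = {"_": ["joy", "sadness", "surprise", "distress"],
                  "o": ["surprise"], "3": ["joy"]}
TRIPLET_EMOTIONS = {("^", "_", "^"): ["joy"], ("T", "_", "T"): ["sadness"]}

# Simplistic eye-mouth-eye shape; the real system handles far more variation.
EMOTICON_PATTERN = re.compile(r"\((.)(.)(.)\)")


def analyze_emoticon(text: str) -> list[str]:
    """Return candidate emotion labels for the first emoticon found in text."""
    match = EMOTICON_PATTERN.search(text)
    if not match:
        return []
    emoticon = match.group(0)

    # Step 1: match the whole emoticon against the raw database.
    if emoticon in RAW_EMOTICON_DB:
        return [RAW_EMOTICON_DB[emoticon]]

    left_eye, mouth, right_eye = match.groups()

    # Step 2: fall back to the annotated eye-mouth-eye triplet.
    triplet = (left_eye, mouth, right_eye)
    if triplet in TRIPLET_EMOTIONS:
        return TRIPLET_EMOTIONS[triplet]

    # Step 3: estimate from the separate semantic areas (eyes and mouth).
    votes = Counter()
    for area, table in ((left_eye, EYE_EMOTIONS),
                        (mouth, MOUTH_EMOTIONS),
                        (right_eye, EYE_EMOTIONS)):
        for emotion in table.get(area, []):
            votes[emotion] += 1
    return [emotion for emotion, _ in votes.most_common()]


if __name__ == "__main__":
    print(analyze_emoticon("That was great (^_^) thanks!"))  # ['joy'] via database hit
    print(analyze_emoticon("Oh no (T.T) what happened?"))    # ['sadness'] via eye kinemes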

Aims & Scope

The IEEE Transactions on Affective Computing is a cross-disciplinary and international archive journal aimed at disseminating results of research on the design of systems that can recognize, interpret, and simulate human emotions and related affective phenomena. 

Full Aims & Scope

Meet Our Editors

Editor In Chief

Björn W. Schuller
Imperial College London 
Department of Computing
180 Queen's Gate, Huxley Bldg.
London SW7 2AZ, UK
e-mail: schuller@ieee.org