
IEEE Transactions on Affective Computing

Issue 2 • April-June 2013


  • Data-Free Prior Model for Facial Action Unit Recognition

    Publication Year: 2013, Page(s): 127-141
    Cited by: Papers (5)

    Facial action recognition is concerned with recognizing local facial motions from images or video. In recent years, besides the development of facial feature extraction and classification techniques, prior models have been introduced to capture the dynamic and semantic relationships among facial action units. Previous works have shown that combining the prior models with the image me...

  • Detecting Depression Severity from Vocal Prosody

    Publication Year: 2013, Page(s): 142-150
    Cited by: Papers (2)

    To investigate the relation between vocal prosody and change in depression severity over time, 57 participants from a clinical trial for treatment of depression were evaluated at seven-week intervals using a semistructured clinical interview for depression severity (Hamilton Rating Scale for Depression (HRSD)). All participants met criteria for major depressive disorder (MDD) at week one. Using bo...

  • DISFA: A Spontaneous Facial Action Intensity Database

    Publication Year: 2013, Page(s): 151-160
    Cited by: Papers (11)

    Access to well-labeled recordings of facial expression is critical to progress in automated facial expression recognition. With few exceptions, publicly available databases are limited to posed facial behavior that can differ markedly in conformation, intensity, and timing from what occurs spontaneously. To meet the need for publicly available corpora of well-labeled video, we collected, ground-tr...

  • EEG-Based Classification of Music Appraisal Responses Using Time-Frequency Analysis and Familiarity Ratings

    Publication Year: 2013, Page(s): 161-172

    A time-windowing feature extraction approach based on time-frequency (TF) analysis is adopted here to investigate the time-course of the discrimination between musical appraisal electroencephalogram (EEG) responses, under the parameter of familiarity. An EEG data set, formed by the responses of nine subjects during music listening, along with self-reported ratings of liking and familiarity, is use...

  • Emotional Responses to Victory and Defeat as a Function of Opponent

    Publication Year: 2013, Page(s): 173-182
    Cited by: Papers (1)

    The experiment with 33 participants showed that the social relationship between players (playing a first-person shooter game against a friend or a stranger, and in single-player mode) influences phasic emotion-related psychophysiological responses to digital game events representing victory and defeat. Irrespective of opponent type, a defeat elicited increasing positive affect and decreasing negat...

  • Exploring Cross-Modality Affective Reactions for Audiovisual Emotion Recognition

    Publication Year: 2013, Page(s): 183-196
    Cited by: Papers (3)

    Psycholinguistic studies on human communication have shown that during interaction individuals tend to adapt their behaviors, mimicking the spoken style, gestures, and expressions of their conversational partners. This synchronization pattern is referred to as entrainment. This study investigates the presence of entrainment at the emotion level in cross-modality settings and its implications ...

  • HED: A Computational Model of Affective Adaptation and Emotion Dynamics

    Publication Year: 2013, Page(s): 197-210

    Affective adaptation is the weakening of the affective response to a constant or repeated affective stimulus through psychological processes. A modified exponentially weighted average computational model of affective adaptation, which predicts its time course and the resulting affective dynamics, is presented. In addition to capturing the primary features of affective adaptation, it is shown... (An illustrative sketch of such an adaptation model appears after this listing.)

  • Porting Multilingual Subjectivity Resources across Languages

    Publication Year: 2013, Page(s): 211-225

    Subjectivity analysis focuses on the automatic extraction of private states in natural language. In this paper, we explore methods for generating subjectivity analysis resources in a new language by leveraging the tools and resources available in English. Given a bridge between English and the selected target language (e.g., a bilingual dictionary or a parallel corpus), the methods can be used ...

  • Positive Affective Interactions: The Role of Repeated Exposure and Copresence

    Publication Year: 2013, Page(s): 226-237

    We describe and evaluate a new interface to induce positive emotions in users: a digital, interactive adaptive mirror. We study whether the induced affect is repeatable after a fixed interval (Study 1) and how copresence influences the emotion induction (Study 2). Results show that participants systematically feel more positive after an affective mirror session, that this effect is repeatable, and...

  • Using a Smartphone to Measure Heart Rate Changes during Relived Happiness and Anger

    Publication Year: 2013, Page(s): 238-241
    Cited by: Papers (2)

    This study demonstrates the feasibility of measuring heart rate (HR) differences associated with emotional states such as anger and happiness with a smartphone. Novice experimenters measured higher HRs during relived anger and happiness (replicating findings in the literature) outside a laboratory environment with a smartphone app that relied on photoplethysmography. (An illustrative sketch of the heart-rate estimation step appears after this listing.)

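The HED article above describes a modified exponentially weighted average model of affective adaptation. The following is a minimal illustrative sketch, not the authors' HED implementation: it treats the felt response as the difference between the current stimulus and an adaptation level that drifts toward the stimulus. The function name, the alpha parameter, and the 0-1 stimulus scale are assumptions made for this example.

```python
# Illustrative sketch of an exponentially weighted average model of
# affective adaptation (assumed parameterization; not the HED model itself).

def affective_response(stimuli, alpha=0.2):
    """Return the felt response to each stimulus in a sequence.

    alpha sets the adaptation speed: 0 means no adaptation,
    1 means the adaptation level jumps to the stimulus immediately.
    """
    level = 0.0            # current adaptation (reference) level
    responses = []
    for s in stimuli:
        responses.append(s - level)                # response weakens as the level catches up
        level = alpha * s + (1.0 - alpha) * level  # exponentially weighted update
    return responses

# A constant stimulus of 1.0: the response decays toward zero as adaptation sets in.
print(affective_response([1.0] * 10))
```

Under this sketch a constant stimulus produces a response that decays geometrically, by a factor of (1 - alpha) per step, which is the qualitative weakening over time that the abstract refers to.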

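The smartphone heart-rate article relies on photoplethysmography (PPG): the camera records subtle color changes caused by blood volume pulses, and heart rate follows from the spacing of the pulse peaks. Below is a rough sketch of that last step only (peak-interval heart-rate estimation); the function name, sampling rate, and minimum peak spacing are illustrative assumptions, not details taken from the paper.

```python
import numpy as np
from scipy.signal import find_peaks

def estimate_bpm(ppg, fs):
    """Estimate heart rate in beats per minute from a PPG trace sampled at fs Hz.

    Illustrative only: a real app would band-pass filter the signal and
    reject motion artifacts before detecting peaks.
    """
    # Require peaks at least 0.4 s apart (caps the estimate near 150 bpm)
    # so a single pulse is not counted twice.
    peaks, _ = find_peaks(ppg, distance=int(0.4 * fs))
    ibi = np.diff(peaks) / fs          # inter-beat intervals in seconds
    return 60.0 / ibi.mean()

# Example with a synthetic 1.2 Hz (72 bpm) pulse sampled at 30 frames per second.
fs = 30.0
t = np.arange(0, 30, 1.0 / fs)
print(estimate_bpm(np.sin(2 * np.pi * 1.2 * t), fs))
```
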
Aims & Scope

The IEEE Transactions on Affective Computing is a cross-disciplinary and international archive journal aimed at disseminating results of research on the design of systems that can recognize, interpret, and simulate human emotions and related affective phenomena. 


Meet Our Editors

Editor In Chief

Björn W. Schuller
Imperial College London 
Department of Computing
180 Queen's Gate, Huxley Bldg.
London SW7 2AZ, UK
e-mail: schuller@ieee.org