
2009 3rd International Conference on Affective Computing and Intelligent Interaction and Workshops (ACII 2009)

Date: 10-12 Sept. 2009


Displaying Results 1 - 25 of 151
  • [Title page]

    Page(s): 1
    PDF (274 KB)
    Freely Available from IEEE
  • Sponsors

    Page(s): 1
    PDF (472 KB)
    Freely Available from IEEE
  • Foreword

    Page(s): 1 - 2
    PDF (325 KB)
    Freely Available from IEEE
  • Committees

    Page(s): 1 - 3
    PDF (60 KB)
    Freely Available from IEEE
  • Keynotes

    Page(s): 1 - 3
    PDF (203 KB)

    User emotional states can seriously affect a user's psychomotor and decision-making capabilities. The goal of this research is to develop a system that recognizes task-specific negative user affective states (e.g., fatigue and stress) and provides appropriate interventions to compensate for the performance decrements resulting from these negative states. The proposed system consists of two major components: multi-modality user state sensing, and user affect and assistance modeling.

  • Evaluating the consequences of affective feedback in intelligent tutoring systems

    Page(s): 1 - 6
    PDF (558 KB) | HTML

    The link between affect and student learning has been the subject of increasing attention. Because of the possible impact of affective states on learning, many intelligent tutoring systems attempt to regulate student emotional states through affective interventions. While much work has gone into improving the quality of these interventions, we are only beginning to understand the complexities of the relationships between affect, learning, and feedback. This paper investigates the consequences associated with providing affective feedback. It represents a first step toward the long-term objective of designing intelligent tutoring systems that can use this information to weigh the risks and benefits of affective intervention. It reports on the results of two studies conducted with students interacting with affect-informed virtual agents. The studies reveal that emotion-specific risk/reward information is associated with particular affective states and suggest that future systems might leverage this information when deciding on affective interventions.

  • EEG-based emotion recognition using hybrid filtering and higher order crossings

    Page(s): 1 - 6
    PDF (298 KB) | HTML

    EEG-based emotion recognition is a relatively new research field in human-computer interaction; its aim is the development of algorithms that identify and recognize emotions from EEG (electroencephalogram) signals. Towards that, this paper presents a novel method that employs an optimized hybrid filter, combining empirical mode decomposition (EMD) and genetic algorithms (GAs), to isolate the intrinsic mode functions (IMFs) that carry the bulk of the initial signal's energy. The filtered signal is reconstructed from the selected IMFs and subjected to higher order crossings (HOC) analysis for feature extraction. The final feature vector is classified into six emotion classes, i.e., happiness, anger, fear, disgust, sadness, and surprise, using quadratic discriminant analysis. The high classification performance (84.72% maximum mean classification rate) demonstrates the effectiveness of the proposed EEG-based emotion recognition approach.
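
    The pipeline described above lends itself to a compact implementation. Below is a minimal sketch, assuming the PyEMD and scikit-learn packages; the paper's GA-based IMF selection is replaced here by a simple cumulative-energy criterion, and the HOC order is an arbitrary choice.

    import numpy as np
    from PyEMD import EMD
    from sklearn.discriminant_analysis import QuadraticDiscriminantAnalysis

    def hoc_features(x, order=10):
        # Higher order crossings: zero-crossing counts of iterated differences.
        feats = []
        for _ in range(order):
            feats.append(int(np.sum(np.diff(np.sign(x - x.mean())) != 0)))
            x = np.diff(x)
        return np.array(feats, dtype=float)

    def hybrid_filter(x, energy_fraction=0.9):
        # Keep the IMFs carrying most of the signal energy (a stand-in for
        # the paper's GA-optimized selection), then rebuild the signal.
        imfs = EMD().emd(x)
        energies = (imfs ** 2).sum(axis=1)
        order = np.argsort(energies)[::-1]
        cumulative = np.cumsum(energies[order]) / energies.sum()
        keep = order[: int(np.searchsorted(cumulative, energy_fraction)) + 1]
        return imfs[keep].sum(axis=0)

    # epochs: (n_trials, n_samples) EEG segments; labels: six emotion classes
    # X = np.array([hoc_features(hybrid_filter(e)) for e in epochs])
    # clf = QuadraticDiscriminantAnalysis().fit(X, labels)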

  • Therapy progress indicator (TPI): Combining speech parameters and the subjective unit of distress

    Page(s): 1 - 6
    PDF (121 KB) | HTML

    Posttraumatic stress disorder (PTSD) is a severe handicap in daily life, and its treatment is complex. To evaluate the success of treatments, an objective and unobtrusive expert system was envisioned: a therapy progress indicator (TPI). Speech was considered an excellent candidate for providing an objective, unobtrusive emotion measure. Speech of 26 PTSD patients was recorded while they participated in two reliving sessions: re-experiencing their last panic attack and their last joyful occasion. As a subjective measure, the subjective unit of distress was determined, which enabled validation of the derived speech features. A set of parameters of the speech features (signal, power, zero-crossing ratio, and pitch) was found to discriminate between the two sessions. A regression model involving these parameters was able to distinguish between positive and negative distress. This model lays the foundation for a TPI for patients with PTSD, enabling objective and unobtrusive evaluation of therapies.
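
    As an illustration of the kind of feature set the abstract describes, the following sketch computes per-recording summaries of power, zero-crossing rate, and pitch and feeds them to a classifier. It assumes the librosa and scikit-learn packages; logistic regression stands in for the paper's regression model.

    import numpy as np
    import librosa
    from sklearn.linear_model import LogisticRegression

    def speech_features(y, sr):
        # Summary statistics of frame power, zero-crossing rate, and pitch.
        power = librosa.feature.rms(y=y)[0] ** 2
        zcr = librosa.feature.zero_crossing_rate(y)[0]
        f0 = librosa.yin(y, fmin=60, fmax=400, sr=sr)
        return np.array([power.mean(), power.std(), zcr.mean(), zcr.std(),
                         np.nanmean(f0), np.nanstd(f0)])

    # X: one feature vector per recording; y: 1 = panic reliving, 0 = joyful
    # model = LogisticRegression().fit(X, y)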

  • The effect of color on expression of joy and sadness in virtual humans

    Page(s): 1 - 7
    PDF (3281 KB) | HTML

    For centuries artists have explored color to express emotions. Following this insight, the paper describes an approach to learning how to use color to influence the perception of emotions in virtual humans. First, a model of lighting and filters inspired by the visual arts is integrated with a virtual human platform to manipulate color. Next, an evolutionary model, based on genetic algorithms, is created to evolve mappings between emotions and lighting and filter parameters. A first study is then conducted in which subjects evolve mappings for joy and sadness without being aware of the evolutionary model. In a second study, the features that characterize the mappings are analyzed. Results show that virtual human images of joy tend to be brighter, more saturated, and more colorful than images of sadness. The paper discusses the relevance of the results for the fields of emotion expression and virtual humans.
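
    The evolutionary model can be pictured as an interactive genetic algorithm in which the subject's choice acts as the fitness signal. The sketch below is a toy version under that assumption; the parameter names (brightness, saturation, n_colors) are illustrative, not the paper's actual lighting and filter model.

    import random

    PARAMS = ["brightness", "saturation", "n_colors"]

    def random_individual():
        return {p: random.random() for p in PARAMS}

    def crossover(a, b):
        return {p: random.choice((a[p], b[p])) for p in PARAMS}

    def mutate(ind, rate=0.1, scale=0.2):
        return {p: (min(1.0, max(0.0, v + random.gauss(0, scale)))
                    if random.random() < rate else v)
                for p, v in ind.items()}

    def evolve(population, pick_best, generations=10):
        # pick_best(population) stands in for the subject choosing the
        # rendering that best conveys the target emotion (joy or sadness).
        for _ in range(generations):
            elite = pick_best(population)
            population = [elite] + [mutate(crossover(elite, random.choice(population)))
                                    for _ in range(len(population) - 1)]
        return population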

  • Multimodal real-time conversation analysis using a novel process engine

    Page(s): 1 - 2
    PDF (138 KB) | HTML

    This contribution introduces a software framework that enables researchers to develop real-time pattern recognition and sensor fusion applications at an abstraction level above that of common programming languages, in order to reduce programming errors and technical obstacles. Furthermore, a proof of concept is described in which two separate instances of the process engine run on different computers and process audiovisual data. The scenario demonstrates the engine's capability to process data in real time and synchronously on multiple machines, both necessary features in large-scale projects.

  • Dynamic emotion and personality synthesis

    Page(s): 1 - 2
    PDF (332 KB) | HTML

    The Emotion AI Emotion and Personality Synthesis technology is a solution to the problem of displaying expressive, lifelike digital characters without the time and cost involved in traditional hand animation or motion capture. In addition, these characters are fully interactive: they are driven procedurally, using various forms of preprocessed input that are modulated through a system simulating various neural pathways, neurochemicals, and neurotransmitters, with output finally modeled through the nerve pathways involved in facial expression and body-posture muscle contractions.

  • Emotion detection in dialog systems: Applications, strategies and challenges

    Page(s): 1 - 6
    PDF (138 KB) | HTML

    Emotion plays an important role in human communication, and human-machine dialog systems can therefore also benefit from affective processing. In this paper we present an overview of our work from the past few years and discuss general considerations, potential applications, and our experiments on the emotional classification of human-machine dialogs. Anger in voice portals, as well as problematic dialog situations, can be detected to some degree, but the noise in real-life data and the difficulty of defining emotions unambiguously remain challenging. Moreover, a dialog system that reacts emotionally may raise expectations about its intellectual abilities that it cannot fulfill.

  • Affective haptics in emotional communication

    Page(s): 1 - 6
    PDF (1383 KB) | HTML

    In this paper we propose a conceptually novel approach to reinforcing (intensifying) one's own feelings and reproducing (simulating) the emotions felt by a partner during online communication, through a specially designed system, iFeel_IM!. The core component, the affect analysis model, automatically recognizes nine emotions from text. The detected emotion is then stimulated through innovative haptic devices integrated into iFeel_IM!. The implemented system can considerably enhance the emotionally immersive experience of real-time messaging.

  • Affect sensing in speech: Studying fusion of linguistic and acoustic features

    Page(s): 1 - 6
    PDF (358 KB) | HTML

    Recently, there has been considerable interest in the recognition of affect in language. In this paper, we investigate how fusion of linguistic (lexical, stylometric, deictic) and acoustic information can be utilized for this purpose and present a comprehensive study of fusion. We examine fusion at both the decision level and the feature level and discuss the obtained results.
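
    For readers unfamiliar with the two fusion levels, the sketch below contrasts them, assuming precomputed acoustic and linguistic feature matrices (Xa, Xl) with labels y and scikit-learn; logistic regression is an arbitrary choice of base classifier, not the paper's.

    import numpy as np
    from sklearn.linear_model import LogisticRegression

    def feature_level(Xa, Xl, y):
        # Early fusion: concatenate the feature vectors, train one model.
        return LogisticRegression(max_iter=1000).fit(np.hstack([Xa, Xl]), y)

    def decision_level(Xa, Xl, y):
        # Late fusion: train one model per modality, average the posteriors.
        clf_a = LogisticRegression(max_iter=1000).fit(Xa, y)
        clf_l = LogisticRegression(max_iter=1000).fit(Xl, y)
        def predict(Xa_new, Xl_new):
            proba = (clf_a.predict_proba(Xa_new) + clf_l.predict_proba(Xl_new)) / 2
            return clf_a.classes_[proba.argmax(axis=1)]
        return predict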

  • A tool for polarity classification of human affect from panel group texts

    Page(s): 1 - 6
    PDF (241 KB) | HTML

    We introduce an explorative tool for affect analysis from texts. Rather than covering the full range of emotions, feelings, and sentiment, our system is currently restricted to the positive or negative polarity of phrases and sentences. It analyses the input texts with the aid of an affect lexicon that specifies, among other things, the prior polarity (positive or negative) of words. A chunker is used to determine the phrases that form the basis for a compositional treatment of phrase-level polarity assignment. In our current experiments we focus on phrases that are targeted towards persons, be it the writer (I, my, me, ...), the social group including the writer (we, our, ...), or the reader (you, your, ...). We evaluate our system on standard data (customer reviews). We also give initial results on a small corpus of 35 texts taken from a panel group called 'I battle depression'.
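
    A minimal sketch of the lexicon-driven, compositional idea follows, with a toy prior-polarity lexicon and a naive rule that negators flip subsequent polarity; the paper's actual lexicon and chunker are far richer.

    PRIOR = {"good": 1, "happy": 1, "love": 1, "bad": -1, "sad": -1, "hate": -1}
    NEGATORS = {"not", "no", "never"}

    def phrase_polarity(tokens):
        score, flip = 0, 1
        for tok in (t.lower() for t in tokens):
            if tok in NEGATORS:
                flip = -flip          # negation flips the prior polarity
            score += flip * PRIOR.get(tok, 0)
        return "positive" if score > 0 else "negative" if score < 0 else "neutral"

    print(phrase_polarity("I do not feel good today".split()))  # -> negative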

  • Facial and vocal emotion expression of a personal computer assistant to engage, educate and motivate children

    Page(s): 1 - 7
    PDF (349 KB) | HTML

    The general goal of our research is to develop a personal computer assistant that persuades children to adhere to a healthy lifestyle during daily activities at home. The assistant will be used in three different roles: companion, educator, and motivator. This study investigates whether the effectiveness of the computer assistant, embodied as an iCat robot, can be improved when it expresses emotions (tested for each of the three roles). It shows that emotion expression can improve the robot's effectiveness in achieving its role objectives. The improvements we found are small, however, probably due to a ceiling effect: all subjective measures were rated very positively in the neutral condition, leaving little room for improvement. The study also showed that the emotional speech was less intelligible, which may limit the robot's effectiveness.

  • A collaborative personalized affective video retrieval system

    Page(s): 1 - 2
    PDF (687 KB) | HTML

    In this demonstration, a collaborative personalized affective video retrieval system is introduced. A dataset of 155 video clips extracted from Hollywood movies was annotated with the emotions felt by participants. More than 1300 annotations from 40 participants were gathered in a database for use in the affective retrieval system. The system can retrieve videos based on emotional keyword queries as well as arousal and valence queries. Users' personal profiles (gender, age, cultural background) were employed to improve the collaborative filtering in retrieval.
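
    As a sketch of the valence/arousal query mode, one can rank clips by distance between the query point and each clip's mean annotated coordinates. The data layout below is an assumption; the paper's collaborative filtering over user profiles is omitted here.

    import numpy as np

    def retrieve(clips, valence, arousal, k=5):
        # clips: list of dicts with keys "id" and "va" = (valence, arousal)
        query = np.array([valence, arousal])
        ranked = sorted(clips,
                        key=lambda c: np.linalg.norm(np.array(c["va"]) - query))
        return [c["id"] for c in ranked[:k]]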

  • Foundations for modelling emotions in game characters: Modelling emotion effects on cognition

    Page(s): 1 - 6
    PDF (201 KB) | HTML

    Affective gaming has received much attention lately, as the gaming community recognizes the importance of emotion in the development of engaging games. Affect plays a key role in the user experience, both in entertainment and in 'serious' games. The current focus in affective gaming is primarily on sensing and recognizing the players' emotions, and on tailoring the game's responses to them. A significant effort is also being devoted to generating 'affective behaviors' in game characters and player avatars, to enhance their realism and believability. Less emphasis is placed on modeling emotions, both their generation and their effects, in the game characters and in user models representing the players. This paper accompanies a tutorial presented at ACII 2009 whose objective was to provide theoretical foundations for modeling emotions in game characters, as well as practical hands-on guidelines to help game developers construct functional models of emotion. While the tutorial covered models of both emotion generation and emotion effects, this paper focuses on modeling emotion effects on cognition.

  • Using dimensional descriptions to express the emotional content of music

    Page(s): 1 - 6
    PDF (278 KB) | HTML

    Dimensional descriptions are used in computational research on music and emotion, but the approach has potentially much more power and subtlety than many applications exploit. Two main kinds of refinement are considered: description of moment-by-moment change, and the use of more than the two or three best-known dimensions.

  • A comparison of PCA, KPCA and LDA for feature extraction to recognize affect in gait kinematics

    Page(s): 1 - 6
    PDF (129 KB) | HTML

    This study investigates the recognition of affect in human walking, as an everyday motion, in order to provide a means for affect recognition at a distance. For this purpose, a database of affective gait patterns from non-professional actors was recorded with optical motion tracking. Principal component analysis (PCA), kernel PCA (KPCA), and linear discriminant analysis (LDA) are applied to kinematic parameters and compared for feature extraction. LDA in combination with naive Bayes achieves an accuracy of 91% for person-dependent recognition of four discrete affective states from observation of barely a single stride, roughly doubling the success obtained with inter-individual recognition. Furthermore, affective states that differ in arousal or dominance are better recognizable in walking. Though the primary task of gait is locomotion, cues about a walker's affective state are recognizable with machine learning techniques.
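
    The comparison the abstract reports maps naturally onto scikit-learn pipelines; the sketch below is a minimal version, with arbitrary component counts (LDA is capped at the number of classes minus one).

    from sklearn.decomposition import PCA, KernelPCA
    from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
    from sklearn.naive_bayes import GaussianNB
    from sklearn.pipeline import make_pipeline
    from sklearn.model_selection import cross_val_score

    extractors = {
        "PCA": PCA(n_components=10),
        "KPCA": KernelPCA(n_components=10, kernel="rbf"),
        "LDA": LinearDiscriminantAnalysis(n_components=3),  # 4 classes -> max 3
    }

    # X: kinematic parameters per stride; y: one of four affective states
    # for name, extractor in extractors.items():
    #     pipe = make_pipeline(extractor, GaussianNB())
    #     print(name, cross_val_score(pipe, X, y, cv=5).mean())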

  • Emotion and music: A view from the cultural psychology of music

    Page(s): 1 - 3
    PDF (140 KB) | HTML

    This paper provides an overview of current thinking in music cognition regarding the perception of emotion in music. A componential view of emotion is adopted, and a variety of routes by which music expresses emotion are presented. Two main questions for future research are identified: first, the extent to which the perception and induction of emotion through music is shared cross-culturally, and second, the factors that contribute to the cross-cultural perception of emotion in music. By drawing upon a biologically and ecologically informed perspective, this paper aims to identify routes for future research that would enable music cognition research to shed light on the socio-historical variability of emotion perception through music.

  • Perception of emotional expressions in different representations using facial feature points

    Page(s): 1 - 6
    PDF (379 KB) | HTML

    Facial expression recognition is an enabling technology for affective computing. Many existing facial expression analysis systems rely on automatically tracked facial feature points. Although psychologists have studied emotion perception from manually specified or marker-based point-light displays, no formal study exists on the amount of emotional information conveyed through automatically tracked feature points. We assess the utility of automatically extracted feature points in conveying emotions for posed and naturalistic data and present results from an experiment that compared human raters' judgements of emotional expressions between actual video clips and three automatically generated representations of them. The implications for optimal face representation and creation of realistic animations are discussed.

  • EMBR: A realtime animation engine for interactive embodied agents

    Page(s): 1 - 2
    PDF (1221 KB) | HTML

    Embodied agents can be powerful interface devices and versatile research tools for the study of emotion, gesture, facial expression, etc. However, they require high effort and expertise for their creation, assembly, and animation control. Therefore, open animation engines and high-level control languages are needed to make embodied agents accessible to researchers and developers. In this demo paper, we present such an engine, EMBR (embodied agents behavior realizer), and its control language, EMBRScript.

  • Does the mood matter?

    Page(s): 1 - 4
    PDF (195 KB) | HTML

    We report the results of an experiment that examined the effects of mood on search performance. Participants were asked to use the Google search engine to find answers to two questions. Searchers' mood was measured using the Positive Affect and Negative Affect Scale (PANAS). Search performance was measured by the number of websites visited, the time spent reading search results, the quality of answers, and other similar measures. Analysis of the relationship between mood and search performance indicated that positive mood prior to the search affected certain search behaviors, but neither positive nor negative mood had a significant effect on the quality of search results.

  • Understanding behavioral problems in text-based communication using neuroscientific perspective

    Page(s): 1 - 6
    PDF (326 KB) | HTML

    In face-to-face communication, humans handle a variety of inputs in addition to the target content. Many affective cues, such as facial expressions, body postures, characteristics of speech, environmental sensory inputs, and even the mood of the interacting parties, influence the overall meaning extracted from communication. However, text-based computer-mediated communication (i.e., instant messaging, email, chat) generally exhibits poor media content in terms of these inputs. In particular, peers communicating through computer-mediated communication (CMC) are prone to making wrong emotional judgments. Because of the tight coupling of emotion and cognition, emotional judgment errors cause errors in the perception of the received message and shift behavioral preference toward fearless, disinhibited, aggressive, and deceptive content in the responses. In this study, we put forward a cognitive neuroscience perspective to show the similarity between the behavioral problems brought about by text-based CMC platforms and the cognitive and emotional behavioral problems exhibited by brain-damaged patient populations. We present brief examples of behavioral deficits observed in patients with amygdala and/or orbito-frontal cortex (OFC) damage and show that these deficits bear striking similarities to those seen on text-based CMC platforms. While we consider ourselves to communicate similarly in face-to-face and computerized text-based environments, our brains produce dissimilar cognitive input and output in these two environments. Our conclusion is that when the communication problems introduced by the limited social cues in email and chat are seen in the light of this neurological perspective, developing solutions for them becomes a priority.
