
Feature Extraction and Selection for Emotion Recognition from Electrodermal Activity


Abstract:

Electrodermal activity (EDA) is indicative of psychological processes related to human cognition and emotions. Previous research has studied many methods for extracting EDA features; however, their appropriateness for emotion recognition has been tested only on a small number of distinct feature sets and on different, usually small, datasets. In the current research, we reviewed 25 studies and implemented 40 different EDA features across the time, frequency, and time-frequency domains on the publicly available AMIGOS dataset. We performed a systematic comparison of these EDA features using three feature selection methods, Joint Mutual Information (JMI), Conditional Mutual Information Maximization (CMIM), and Double Input Symmetrical Relevance (DISR), together with machine learning techniques. We found that approximately the same number of features is required to obtain optimal accuracy for arousal recognition and for valence recognition. The subject-dependent classification results were also significantly higher than the subject-independent results for both arousal and valence recognition. Statistical features related to the Mel-Frequency Cepstral Coefficients (MFCC) were explored for the first time for emotion recognition from EDA signals, and they outperformed all other feature groups, including the most commonly used Skin Conductance Response (SCR) related features.
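The MFCC-based statistical features described above can be sketched roughly as follows. This is a minimal illustration, not the authors' pipeline: the frame length, hop size, filterbank size, number of coefficients, and the choice of mean/std summary statistics are all assumptions, and the 128 Hz sampling rate is merely typical of physiological recordings such as those in AMIGOS.

```python
import numpy as np
from scipy.fftpack import dct

def mel_filterbank(sr, n_fft, n_mels):
    # Triangular filters spaced evenly on the mel scale, spanning 0 .. sr/2.
    mel_max = 2595.0 * np.log10(1.0 + (sr / 2) / 700.0)
    mel_pts = np.linspace(0.0, mel_max, n_mels + 2)
    hz_pts = 700.0 * (10.0 ** (mel_pts / 2595.0) - 1.0)
    bins = np.floor((n_fft + 1) * hz_pts / sr).astype(int)
    fb = np.zeros((n_mels, n_fft // 2 + 1))
    for m in range(1, n_mels + 1):
        left, center, right = bins[m - 1], bins[m], bins[m + 1]
        for k in range(left, center):
            fb[m - 1, k] = (k - left) / max(center - left, 1)
        for k in range(center, right):
            fb[m - 1, k] = (right - k) / max(right - center, 1)
    return fb

def mfcc_statistics(eda, sr=128, n_fft=256, hop=128, n_mels=20, n_mfcc=12):
    # Frame the signal, Hamming-window each frame, take the power spectrum.
    n_frames = 1 + (len(eda) - n_fft) // hop
    window = np.hamming(n_fft)
    frames = np.stack([eda[i * hop:i * hop + n_fft] * window
                       for i in range(n_frames)])
    power = np.abs(np.fft.rfft(frames, n_fft)) ** 2 / n_fft
    # Mel-scale log energies, then DCT -> cepstral coefficients per frame.
    log_mel = np.log(power @ mel_filterbank(sr, n_fft, n_mels).T + 1e-10)
    mfcc = dct(log_mel, type=2, axis=1, norm='ortho')[:, :n_mfcc]
    # Summary statistics over frames give one feature vector per recording.
    return np.concatenate([mfcc.mean(axis=0), mfcc.std(axis=0)])
```

Summarizing the per-frame coefficients with statistics (here mean and standard deviation) turns a variable-length EDA recording into a fixed-length vector suitable for the feature-selection stage.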
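Of the three feature-selection criteria named above, JMI admits a compact greedy formulation: after seeding with the single feature most informative about the label, each step adds the feature f maximizing the sum over already-selected features s of I((f, s); Y). The sketch below assumes features have already been discretized into integer bins; the histogram-based MI estimator and all function names are illustrative, not taken from the paper.

```python
import numpy as np

def mutual_info(x, y):
    # Plug-in MI estimate (in nats) between two discrete variables,
    # computed from their empirical contingency table.
    xi = np.unique(x, return_inverse=True)[1]
    yi = np.unique(y, return_inverse=True)[1]
    joint = np.zeros((xi.max() + 1, yi.max() + 1))
    np.add.at(joint, (xi, yi), 1)
    pxy = joint / len(x)
    px = pxy.sum(axis=1, keepdims=True)
    py = pxy.sum(axis=0, keepdims=True)
    nz = pxy > 0
    return float((pxy[nz] * np.log(pxy[nz] / (px * py)[nz])).sum())

def pair_code(a, b):
    # Encode two discrete features as one discrete variable, so that
    # I((a, b); y) reduces to a two-variable MI computation.
    ai = np.unique(a, return_inverse=True)[1]
    bi = np.unique(b, return_inverse=True)[1]
    return ai * (bi.max() + 1) + bi

def jmi_select(X, y, k):
    # Greedy JMI: seed with the feature of highest I(f; y), then repeatedly
    # add the feature maximizing sum_{s in selected} I((f, s); y).
    n_feat = X.shape[1]
    selected = [int(np.argmax([mutual_info(X[:, j], y) for j in range(n_feat)]))]
    while len(selected) < k:
        scores = {j: sum(mutual_info(pair_code(X[:, j], X[:, s]), y)
                         for s in selected)
                  for j in range(n_feat) if j not in selected}
        selected.append(max(scores, key=scores.get))
    return selected
```

CMIM and DISR follow the same greedy loop with a different per-candidate score (a minimum of conditional MIs for CMIM, a normalized symmetrical relevance for DISR), which is why the three methods are often compared side by side.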
Published in: IEEE Transactions on Affective Computing (Volume: 12, Issue: 4, Oct.-Dec. 2021)
Page(s): 857 - 869
Date of Publication: 26 February 2019

