Emotion classification based on forehead biosignals using support vector machines in music listening

3 Author(s)
Naji, M.; Dept. of Biomed. Eng., Islamic Azad Univ., Tehran, Iran; Firoozabadi, M.; Azadfallah, P.

The purpose of this study was to investigate the feasibility of using forehead biosignals as informative channels for the classification of music-induced emotions. Classification of four emotional states in the Arousal-Valence space was performed by employing two parallel support vector machines as arousal and valence classifiers. Relative powers of EEG sub-bands, spectral entropy, mean power frequency, and higher-order crossings were extracted from each of the three forehead data channels: left Temporalis, Frontalis, and right Temporalis. The inputs of the classifiers were obtained by a feature selection algorithm based on a fuzzy-rough model. Averaged subject-independent classification accuracies of 93.80%, 92.43%, and 86.67% were achieved for arousal classification, valence classification, and classification of the four emotional states in the Arousal-Valence space, respectively.

Published in: 2012 IEEE 12th International Conference on Bioinformatics & Bioengineering (BIBE)

Date of Conference: 11-13 Nov. 2012
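
To make the classification scheme described in the abstract concrete, the following is a minimal Python/scikit-learn sketch, not the authors' implementation: one SVM predicts the arousal label and a second, parallel SVM predicts the valence label, and the two binary decisions are combined into one of the four Arousal-Valence quadrants. The fuzzy-rough feature selection used in the paper is replaced here with a generic SelectKBest step, and the feature matrix X is a random placeholder standing in for the band-power, spectral-entropy, mean-power-frequency, and higher-order-crossing features extracted from the three forehead channels.

```python
import numpy as np
from sklearn.feature_selection import SelectKBest, f_classif
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC


def build_classifier(k_features=10):
    """SVM preceded by scaling and a simple feature-selection step
    (stand-in for the paper's fuzzy-rough selection)."""
    return make_pipeline(
        StandardScaler(),
        SelectKBest(f_classif, k=k_features),
        SVC(kernel="rbf"),
    )


# Placeholder data: rows are trials, columns are features pooled from the
# left Temporalis, Frontalis, and right Temporalis channels.
X = np.random.rand(120, 30)
y_arousal = np.random.randint(0, 2, 120)  # 1 = high arousal, 0 = low arousal
y_valence = np.random.randint(0, 2, 120)  # 1 = positive valence, 0 = negative

# Two parallel classifiers, one per affective dimension.
arousal_clf = build_classifier()
valence_clf = build_classifier()

print("arousal CV accuracy:", cross_val_score(arousal_clf, X, y_arousal, cv=5).mean())
print("valence CV accuracy:", cross_val_score(valence_clf, X, y_valence, cv=5).mean())

# Combining the two binary outputs yields one of the four emotional states
# (quadrants of the Arousal-Valence space), encoded here as 0..3.
arousal_clf.fit(X, y_arousal)
valence_clf.fit(X, y_valence)
quadrant = 2 * arousal_clf.predict(X) + valence_clf.predict(X)
```

Running both classifiers on the same feature vector and fusing their outputs is what allows the four-class problem to be handled with two binary SVMs, which matches the parallel arousal/valence structure described in the abstract; the specific pipeline components and placeholder data above are illustrative assumptions only.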