
Robust Unsupervised Arousal Rating: A Rule-Based Framework with Knowledge-Inspired Vocal Features


Abstract:

Studies in classifying affect from vocal cues have produced exceptional within-corpus results, especially for arousal (activation or stress); yet cross-corpora affect recognition has only recently garnered attention. An essential requirement of many behavioral studies is affect scoring that generalizes across different social contexts and data conditions. We present a robust, unsupervised (rule-based) method for providing a scale-continuous, bounded arousal rating operating on the vocal signal. The method incorporates just three knowledge-inspired features chosen based on empirical and theoretical evidence. It constructs a speaker's baseline model for each feature separately, and then computes single-feature arousal scores. Lastly, it advantageously fuses the single-feature arousal scores into a final rating without knowledge of the true affect. The baseline data is preferably labeled as neutral, but some initial evidence is provided to suggest that no labeled data is required in certain cases. The proposed method is compared to a state-of-the-art supervised technique which employs a high-dimensional feature set. The proposed framework achieves highly competitive performance with additional benefits. The measure is interpretable, scale-continuous as opposed to discrete, and can operate without any affective labeling. An accompanying Matlab tool is made available with the paper.
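The three-stage pipeline the abstract describes (per-feature speaker baseline, bounded single-feature score, unsupervised fusion) can be sketched as follows. This is a minimal illustrative sketch, not the authors' exact method: the feature names, the logistic squashing, and the mean-based fusion rule are all assumptions introduced here for clarity.

```python
import numpy as np

# Hypothetical sketch of the abstract's pipeline; feature names, the
# logistic mapping, and mean fusion are illustrative assumptions.

def baseline_model(neutral_values):
    """Stage 1: per-feature speaker baseline from (preferably neutral) data."""
    values = np.asarray(neutral_values, dtype=float)
    return values.mean(), values.std(ddof=1)

def single_feature_score(value, mean, std):
    """Stage 2: map one feature's deviation from baseline to a bounded (0, 1) rating."""
    z = (value - mean) / std           # standardized deviation from the neutral baseline
    return 1.0 / (1.0 + np.exp(-z))   # logistic squashing keeps the score bounded

def fuse_scores(scores):
    """Stage 3: unsupervised fusion; here simply the mean of per-feature ratings."""
    return float(np.mean(scores))

# Toy usage with three hypothetical vocal features for one speaker.
neutral = {
    "pitch": [120.0, 118.0, 122.0],       # Hz
    "intensity": [60.0, 62.0, 61.0],      # dB
    "hf_energy": [0.20, 0.25, 0.22],      # high-frequency energy ratio
}
observation = {"pitch": 150.0, "intensity": 70.0, "hf_energy": 0.40}

scores = []
for feature, value in observation.items():
    mu, sigma = baseline_model(neutral[feature])
    scores.append(single_feature_score(value, mu, sigma))

arousal = fuse_scores(scores)  # scale-continuous rating bounded in (0, 1)
```

Because each stage uses only the speaker's own baseline statistics, no affective labels are needed at rating time, which mirrors the unsupervised property the abstract claims.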
Published in: IEEE Transactions on Affective Computing (Volume: 5, Issue: 2, April-June 2014)
Page(s): 201 - 213
Date of Publication: 30 May 2014

PubMed ID: 25705327
