Abstract:
This paper reports results from preliminary experiments on automatic classification of spoken affect valence. The task was to classify short spoken sentences into one of two classes: approving or disapproving. Using an optimal combination of six acoustic measurements, our classifier achieved accuracies of 65% to 88% for speaker-dependent, text-independent classification. The results suggest that pitch and energy measurements may be used to automatically classify spoken affect valence, but more research will be necessary to understand individual variation and to broaden the range of affect classes that can be recognized. In a second experiment, we evaluated human performance in classifying the same speech samples and found similarities between the human and automatic classification results.
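The abstract does not specify which six acoustic measurements or which classifier were used, so the following is only a minimal, hypothetical sketch of such a pipeline: per-utterance pitch and energy statistics feeding a binary (approving vs. disapproving) classifier. The feature choices, library calls (librosa, scikit-learn), and thresholds are illustrative assumptions, not the paper's actual method.

```python
# Hypothetical sketch of a pitch/energy-based affect-valence classifier.
# Not the paper's method: the six features and classifier are assumptions.
import numpy as np
import librosa
from sklearn.linear_model import LogisticRegression

def utterance_features(y, sr):
    """Six illustrative acoustic statistics for one utterance waveform y."""
    f0 = librosa.yin(y, fmin=75, fmax=400, sr=sr)   # frame-level pitch (Hz)
    rms = librosa.feature.rms(y=y)[0]               # frame-level energy
    return np.array([
        np.mean(f0), np.std(f0), np.ptp(f0),        # pitch mean / spread / range
        np.mean(rms), np.std(rms), np.max(rms),     # energy mean / spread / peak
    ])

def train_classifier(samples):
    """samples: list of (waveform, sample_rate, label) with label 1 = approving, 0 = disapproving."""
    X = np.vstack([utterance_features(y, sr) for y, sr, _ in samples])
    labels = np.array([lab for _, _, lab in samples])
    return LogisticRegression(max_iter=1000).fit(X, labels)
```

A speaker-dependent setup, as described in the abstract, would train and evaluate such a model on utterances from a single speaker.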
Published in: Proceedings of the Second International Conference on Automatic Face and Gesture Recognition
Date of Conference: 14-16 October 1996
Date Added to IEEE Xplore: 06 August 2002
Print ISBN: 0-8186-7713-9