To provide a means for recognizing affect from a distance, this paper analyzes the capability of gait to reveal a person's affective state. We address interindividual (person-independent) versus person-dependent recognition, recognition based on discrete affective states versus recognition based on affective dimensions, and efficient feature extraction with respect to affect. Principal component analysis (PCA), kernel PCA, linear discriminant analysis, and general discriminant analysis are compared as methods to either reduce temporal information in gait or extract features relevant for classification. Although the expression of affect in gait is overlaid by the primary task of locomotion, person-dependent recognition based on motion capture data reaches 95% accuracy from the observation of a single stride. In particular, different levels of arousal and dominance can be recognized reliably in gait. We conclude that gait can serve as an additional modality for the recognition of affect. Application scenarios include monitoring in high-security areas, human-robot interaction, and cognitive home environments.
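The pipeline the abstract describes (dimensionality reduction of per-stride gait features followed by classification) can be illustrated with a minimal sketch. This is not the paper's implementation: the data here are synthetic placeholders standing in for per-stride motion-capture feature vectors, PCA is computed directly via SVD, and a simple nearest-centroid classifier stands in for the classifiers compared in the paper.

```python
import numpy as np

# Illustrative sketch (not the paper's method): PCA-based reduction of
# per-stride gait feature vectors, then nearest-centroid classification.
rng = np.random.default_rng(0)

# Synthetic placeholder data: 20 strides x 12 gait features per class,
# e.g. low vs. high arousal, with a small class offset.
X_low = rng.normal(0.0, 1.0, size=(20, 12))
X_high = rng.normal(0.8, 1.0, size=(20, 12))
X = np.vstack([X_low, X_high])
y = np.array([0] * 20 + [1] * 20)

# PCA via SVD on mean-centered data, keeping 3 components.
mu = X.mean(axis=0)
U, S, Vt = np.linalg.svd(X - mu, full_matrices=False)
Z = (X - mu) @ Vt[:3].T  # strides projected into the reduced space

# Nearest-centroid classification in the reduced space.
centroids = np.array([Z[y == c].mean(axis=0) for c in (0, 1)])
pred = np.argmin(
    np.linalg.norm(Z[:, None, :] - centroids[None, :, :], axis=2),
    axis=1,
)
accuracy = (pred == y).mean()
```

In the same spirit, kernel PCA or a discriminant analysis could replace the SVD step to extract features tuned to the affect classes rather than to overall variance.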