This paper presents an automatic approach to emotion recognition using a bimodal system based on facial expressions and physiological signals. Information fusion combines the information carried by the two modalities. We tested two approaches: the first, based on mutual information, selects the most relevant features; the second, based on principal component analysis, transforms the data into another feature space. The results obtained with both modalities combined are better than those obtained with either modality alone.
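The two fusion approaches described above can be sketched as follows. This is a minimal illustration, not the authors' implementation: the feature dimensions, the number of retained components, and the random data standing in for facial and physiological features are all assumptions, and scikit-learn is used as a stand-in toolkit.

```python
import numpy as np
from sklearn.feature_selection import SelectKBest, mutual_info_classif
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)
# Hypothetical stand-ins for the two modalities: facial-expression
# features and physiological-signal features for 100 samples.
facial = rng.normal(size=(100, 20))
physio = rng.normal(size=(100, 10))
labels = rng.integers(0, 4, size=100)  # e.g. four emotion classes

# Feature-level fusion: concatenate the features of both modalities.
fused = np.hstack([facial, physio])

# Approach 1: mutual information selects the most relevant features.
mi_selected = SelectKBest(mutual_info_classif, k=15).fit_transform(fused, labels)

# Approach 2: PCA transforms the fused data into another space.
pca_projected = PCA(n_components=15).fit_transform(fused)

print(mi_selected.shape, pca_projected.shape)  # (100, 15) (100, 15)
```

Either reduced representation would then be fed to a classifier; the abstract reports that this bimodal pipeline outperforms each modality used separately.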