
Dominant SIngle-Modal SUpplementary Fusion (SIMSUF) for Multimodal Sentiment Analysis


Abstract:

Multimodal sentiment analysis remains a significant challenge due to the lack of effective fusion solutions. An effective fusion is expected to obtain the correct semantic representation for all modalities while thoroughly exploring the contribution of each modality. In this paper, we propose a dominant SIngle-Modal SUpplementary Fusion (SIMSUF) approach to perform effective multimodal fusion for sentiment analysis. SIMSUF is composed of three major components: a dominant modality supplementary module, a modality enhancement module, and a multimodal fusion module. The dominant modality supplementary module determines the dominant modality by estimating the mutual dependence between every pair of modalities; the dominant modality is then adopted to supplement the other modalities for representative feature learning. To further explore each modality's contribution, we propose a two-branch modality enhancement module, where one branch learns a common representation distribution across modalities, while a specific modality enhancement branch performs semantic difference enhancement and distribution difference enhancement for each modality. Finally, a dominant-modality-led fusion module fuses the multimodal representations of the two branches for sentiment analysis. Extensive experiments are conducted on the CMU-MOSEI and CMU-MOSI datasets. Experimental results demonstrate that our approach is superior to state-of-the-art approaches.
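As an illustrative sketch only (the paper's actual dependence estimator and training pipeline are not reproduced here), the dominant-modality determination step described above could be approximated by scoring every modality pair with a simple dependence measure, such as linear CKA between feature matrices, and selecting the modality with the highest total dependence on the others. The `pick_dominant` helper and the use of CKA as the dependence proxy are assumptions for illustration, not the authors' method.

```python
import numpy as np

def linear_cka(X, Y):
    """Linear CKA similarity between two feature matrices (samples x dims).

    Used here only as a stand-in proxy for the mutual dependence between
    two modalities; the paper's own estimator may differ.
    """
    Xc = X - X.mean(axis=0)
    Yc = Y - Y.mean(axis=0)
    num = np.linalg.norm(Xc.T @ Yc, "fro") ** 2
    den = np.linalg.norm(Xc.T @ Xc, "fro") * np.linalg.norm(Yc.T @ Yc, "fro")
    return num / den

def pick_dominant(modalities):
    """Return the name of the modality whose summed pairwise dependence
    on all other modalities is largest (hypothetical selection rule)."""
    names = list(modalities)
    totals = {
        a: sum(linear_cka(modalities[a], modalities[b])
               for b in names if b != a)
        for a in names
    }
    return max(totals, key=totals.get)
```

For example, with `modalities = {"text": T, "audio": A, "vision": V}` where each value is an `(n_samples, dim)` feature array, `pick_dominant(modalities)` returns the modality most strongly coupled to the rest, which would then supplement the other modalities during feature learning.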
Published in: IEEE Transactions on Multimedia ( Volume: 26)
Page(s): 8383 - 8394
Date of Publication: 26 December 2023

