Exploring Multimodal Multiscale Features for Sentiment Analysis Using Fuzzy-Deep Neural Network Learning


Abstract:

Sentiment analysis, a challenging task in understanding human emotions expressed through diverse modalities, continues to prompt innovative solutions. Multimodal data often contain important complementary information, and the effective fusion and extraction of multimodal features are key issues in sentiment analysis. In this article, we introduce a novel sentiment analysis model that integrates multimodal multiscale features based on a fuzzy-deep neural network. First, we combine multimodal data, namely text, audio, and images, to extract intrinsic feature representations. Second, our model incorporates a fuzzy-deep neural network learning module, infused with fuzzy logic principles to enhance adaptability to the inherent vagueness of sentiment expressions. Furthermore, we integrate a dual attention mechanism that dynamically focuses on pivotal aspects of the multimodal data, refining feature extraction for improved context awareness. Rigorous validation on three datasets, namely the Multimodal Corpus of Sentiment Intensity dataset, the Multimodal Opinion Sentiment and Emotion Intensity dataset, and the Chinese Single and Multimodal Sentiment dataset, demonstrates the model's superior performance in capturing the intricacies of human emotions.
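
For readers who want a concrete picture of the ideas summarized above, the following is a minimal, hypothetical PyTorch sketch (not the authors' implementation) of how a fuzzy membership layer and a dual attention fusion step could be combined for text, audio, and image features. All module names, feature dimensions, and the fusion order are illustrative assumptions rather than details taken from the paper.

```python
# Minimal sketch: a fuzzy membership layer that softens crisp features, plus a
# dual attention step that weights modalities and feature dimensions before fusion.
# Layer names, sizes, and fusion order are assumptions made for illustration only.
import torch
import torch.nn as nn
import torch.nn.functional as F


class FuzzyMembershipLayer(nn.Module):
    """Maps each feature to Gaussian membership degrees over learnable fuzzy sets."""

    def __init__(self, in_dim: int, num_sets: int = 3):
        super().__init__()
        self.centers = nn.Parameter(torch.randn(in_dim, num_sets))
        self.log_sigma = nn.Parameter(torch.zeros(in_dim, num_sets))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, in_dim) -> memberships: (batch, in_dim, num_sets)
        diff = x.unsqueeze(-1) - self.centers            # broadcast over fuzzy sets
        memberships = torch.exp(-0.5 * (diff / self.log_sigma.exp()) ** 2)
        return memberships.flatten(1)                    # (batch, in_dim * num_sets)


class DualAttentionFusion(nn.Module):
    """Attends over modalities and over feature dimensions, then fuses."""

    def __init__(self, dim: int):
        super().__init__()
        self.modality_score = nn.Linear(dim, 1)          # which modality matters
        self.feature_gate = nn.Linear(dim, dim)          # which features matter

    def forward(self, feats: torch.Tensor) -> torch.Tensor:
        # feats: (batch, num_modalities, dim), e.g. text, audio, image features
        alpha = F.softmax(self.modality_score(feats), dim=1)   # (batch, M, 1)
        gate = torch.sigmoid(self.feature_gate(feats))         # (batch, M, dim)
        return (alpha * gate * feats).sum(dim=1)               # (batch, dim)


class FuzzyMultimodalSentiment(nn.Module):
    """Toy pipeline: per-modality encoders -> fuzzy layer -> dual attention -> classifier."""

    def __init__(self, text_dim=768, audio_dim=74, image_dim=512, hidden=128, num_classes=3):
        super().__init__()
        self.encoders = nn.ModuleList([
            nn.Linear(text_dim, hidden),
            nn.Linear(audio_dim, hidden),
            nn.Linear(image_dim, hidden),
        ])
        self.fuzzy = FuzzyMembershipLayer(hidden, num_sets=3)
        self.project = nn.Linear(hidden * 3, hidden)     # re-project fuzzified features
        self.fusion = DualAttentionFusion(hidden)
        self.classifier = nn.Linear(hidden, num_classes)

    def forward(self, text, audio, image):
        per_modality = []
        for encoder, x in zip(self.encoders, (text, audio, image)):
            h = torch.relu(encoder(x))
            per_modality.append(self.project(self.fuzzy(h)))
        feats = torch.stack(per_modality, dim=1)         # (batch, 3, hidden)
        return self.classifier(self.fusion(feats))


if __name__ == "__main__":
    model = FuzzyMultimodalSentiment()
    logits = model(torch.randn(4, 768), torch.randn(4, 74), torch.randn(4, 512))
    print(logits.shape)  # torch.Size([4, 3])
```

The sketch keeps the two components separable: the Gaussian membership layer handles the vagueness of sentiment cues within each modality, while the dual attention step decides both which modality and which feature dimensions dominate the fused representation.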
Published in: IEEE Transactions on Fuzzy Systems (Volume: 33, Issue: 1, January 2025)
Page(s): 28 - 42
Date of Publication: 26 June 2024
