
MAVEN: A Memory Augmented Recurrent Approach for Multimodal Fusion


Abstract:

Multisensory systems provide complementary information that aids many machine learning approaches in perceiving the environment comprehensively. These systems consist of heterogeneous modalities, which have disparate characteristics and feature distributions. Thus, extracting, aligning, and fusing complementary representations from heterogeneous modalities (e.g., visual, skeleton, and physical sensors) remains challenging. To address these challenges, we have used insights from several neuroscience studies of animal multisensory systems to develop MAVEN, a memory-augmented recurrent approach for multimodal fusion. MAVEN generates unimodal memory banks comprising spatial-temporal features and uses our proposed recurrent representation alignment approach to align and refine unimodal representations iteratively. MAVEN then utilizes a multimodal variational attention-based fusion approach to produce a robust multimodal representation from the aligned unimodal features. Our extensive experimental evaluations on three multimodal datasets suggest that MAVEN outperforms state-of-the-art multimodal learning approaches in the challenging human activity recognition task across all evaluation conditions (cross-subject, leave-one-subject-out, and cross-session). Additionally, our ablation studies suggest that MAVEN significantly outperforms feed-forward fusion-based learning models (p < 0.05). Finally, the robust performance of MAVEN in extracting complementary multimodal representations from occluded and noisy data suggests its applicability to real-world datasets.
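The pipeline described above (per-modality features → iterative recurrent alignment → variational attention-based fusion) can be illustrated with a minimal NumPy sketch. This is not the authors' implementation: the alignment rule (pulling each modality toward the cross-modal mean), the attention scoring, and the Gaussian sampling step are simplified assumptions made for illustration only.

```python
import numpy as np

rng = np.random.default_rng(0)

def align_recurrent(unimodal, steps=3, alpha=0.5):
    """Toy stand-in for recurrent representation alignment: each
    modality's vector is iteratively pulled toward the cross-modal
    mean, so the representations converge toward a shared space."""
    reps = [m.copy() for m in unimodal]
    for _ in range(steps):
        mean = np.mean(reps, axis=0)
        reps = [(1 - alpha) * r + alpha * mean for r in reps]
    return reps

def variational_attention_fusion(reps):
    """Toy variational attention fusion: softmax weights over the
    modalities give a weighted mean, and the fused representation is
    sampled from a Gaussian around that mean (reparameterization)."""
    scores = np.array([np.linalg.norm(r) for r in reps])
    weights = np.exp(scores - scores.max())
    weights /= weights.sum()
    mu = sum(w * r for w, r in zip(weights, reps))
    log_var = np.log(np.var(reps, axis=0) + 1e-6)
    eps = rng.standard_normal(mu.shape)
    return mu + np.exp(0.5 * log_var) * eps, weights

# Three toy "memory bank" readouts standing in for visual, skeleton,
# and physical-sensor features (8-dimensional each).
modalities = [rng.standard_normal(8) for _ in range(3)]
aligned = align_recurrent(modalities)
fused, weights = variational_attention_fusion(aligned)
print(fused.shape, weights.round(3))
```

The sketch only conveys the data flow; in the paper each stage is a learned neural module trained end to end.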
Published in: IEEE Transactions on Multimedia ( Volume: 25)
Page(s): 3694 - 3708
Date of Publication: 01 April 2022

