
Fusion of Inertial and High-Resolution Acoustic Data for Privacy-Preserving Human Activity Recognition


Abstract:

Multimodal human activity recognition (HAR) offers significant advantages over single-modality approaches, particularly in the recently discussed fusion of inertial and acoustic data. However, audio recordings often carry sensitive personal information, and although some studies have focused on audio privacy protection, speech frequencies (below 8 kHz) can still potentially be reconstructed using deep learning techniques. This article presents a novel approach to protecting audio privacy in multimodal HAR by using low-cost microphones to capture high-resolution (Hi-res) audio and filtering sensitive information at both the nonspeech (8–96 kHz) and inaudible (20–96 kHz) levels. We collected a dataset of 20 comprehensive daily activities from 15 participants using custom hardware, with ground truth established from video evidence. Building on this foundation, the article proposes a new hybrid-attention-based HAR method that leverages self-attention (SA) to extract salient features in both the temporal and latent-space domains and cross-attention (CA) to explore intermodal relationships. In the evaluation on the collected dataset, the proposed method demonstrates significant performance improvements over single-modality approaches and outperforms common direct-concatenation fusion methods. In addition, the inaudible ultrasonic frequencies alone have demonstrated the ability to differentiate certain activities, making them effective for multimodal fusion in scenarios with strict privacy requirements.
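The two privacy levels described in the abstract amount to high-pass filtering of the Hi-res audio before any further processing or storage. The sketch below, in Python with SciPy, illustrates that idea only; the 192 kHz sampling rate (implied by the 96 kHz upper band), the filter order, and the function names are assumptions for illustration, not the authors' implementation.

```python
import numpy as np
from scipy.signal import butter, sosfiltfilt

# Assumed sampling rate: a 96 kHz upper band implies 192 kHz Hi-res capture.
SAMPLE_RATE_HZ = 192_000

def privacy_filter(audio: np.ndarray, cutoff_hz: float,
                   sample_rate: int = SAMPLE_RATE_HZ, order: int = 8) -> np.ndarray:
    """Zero-phase high-pass filter that removes content below cutoff_hz.

    cutoff_hz = 8_000 keeps the nonspeech band (8-96 kHz);
    cutoff_hz = 20_000 keeps only the inaudible band (20-96 kHz).
    """
    sos = butter(order, cutoff_hz, btype="highpass", fs=sample_rate, output="sos")
    return sosfiltfilt(sos, audio)

# Example: strip speech frequencies before any feature extraction or storage.
raw = np.random.randn(SAMPLE_RATE_HZ)            # 1 s of placeholder Hi-res audio
nonspeech = privacy_filter(raw, cutoff_hz=8_000)
inaudible = privacy_filter(raw, cutoff_hz=20_000)
```

Likewise, the hybrid-attention fusion can be pictured as per-modality self-attention followed by cross-attention between the inertial and acoustic streams. The PyTorch sketch below is a minimal reading of that description; the feature dimension, number of heads, pooling, classifier head, and class names are hypothetical, and the paper's latent-space SA branch is not reproduced here.

```python
import torch
import torch.nn as nn

class HybridAttentionFusion(nn.Module):
    """Sketch: SA within each modality, then CA across modalities."""
    def __init__(self, dim: int = 128, heads: int = 4, num_classes: int = 20):
        super().__init__()
        # Self-attention refines each modality's own feature sequence.
        self.sa_imu = nn.MultiheadAttention(dim, heads, batch_first=True)
        self.sa_audio = nn.MultiheadAttention(dim, heads, batch_first=True)
        # Cross-attention lets inertial queries attend to acoustic keys/values
        # (and vice versa) to model intermodal relationships.
        self.ca_imu2audio = nn.MultiheadAttention(dim, heads, batch_first=True)
        self.ca_audio2imu = nn.MultiheadAttention(dim, heads, batch_first=True)
        self.classifier = nn.Linear(2 * dim, num_classes)  # 20 daily activities

    def forward(self, imu: torch.Tensor, audio: torch.Tensor) -> torch.Tensor:
        # imu: (batch, T_imu, dim); audio: (batch, T_audio, dim)
        imu, _ = self.sa_imu(imu, imu, imu)
        audio, _ = self.sa_audio(audio, audio, audio)
        fused_imu, _ = self.ca_imu2audio(imu, audio, audio)
        fused_audio, _ = self.ca_audio2imu(audio, imu, imu)
        # Pool over time and concatenate the two cross-attended streams.
        feat = torch.cat([fused_imu.mean(dim=1), fused_audio.mean(dim=1)], dim=-1)
        return self.classifier(feat)

# Example with placeholder feature sequences of different lengths per modality.
model = HybridAttentionFusion()
logits = model(torch.randn(2, 100, 128), torch.randn(2, 250, 128))  # (2, 20)
```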
Article Sequence Number: 9519320
Date of Publication: 30 April 2025
