Abstract:
Human activity recognition (HAR) technology is increasingly utilized in domains such as security surveillance, nursing home monitoring, and health assessment. The integration of multi-sensor data improves recognition efficiency and the precision of behavioral analysis by offering a more comprehensive view of human activities. However, challenges arise due to the diversity of data types, dimensions, sampling rates, and environmental disturbances, which complicate feature extraction and data fusion. To address these challenges, we propose a HAR approach that fuses millimeter-wave radar and inertial navigation data using bimodal neural networks. We first design a comprehensive data acquisition framework that integrates both radar and inertial navigation systems, with a focus on ensuring time synchronization. The radar data undergoes range compression, moving target indication (MTI), short-time Fourier transforms (STFT), and wavelet transforms to reduce noise and improve quality and stability. The inertial navigation data is refined through moving average filtering and hysteresis compensation to enhance accuracy and reduce latency. Next, we introduce the Radar-Inertial Navigation Multi-modal Fusion Attention (T-C-RIMFA) model. In this model, a Convolutional Neural Network (CNN) processes the 1D inertial navigation data for feature extraction, while a channel attention mechanism prioritizes features from different convolutional kernels. Simultaneously, a Vision Transformer (ViT) interprets features from radar-derived micro-Doppler images. Experimental results demonstrate significant improvements in HAR tasks, achieving an accuracy of 0.988. This approach effectively leverages the strengths of both sensors, enhancing the accuracy and robustness of HAR systems.
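The inertial smoothing step mentioned above can be sketched as a causal moving average filter. This is a generic illustration, not the authors' implementation; the window length and the pure-Python formulation are assumptions, since the abstract does not give filter parameters.

```python
def moving_average(signal, window=5):
    """Causal moving average: each output sample is the mean of the
    current sample and up to (window - 1) preceding samples.
    Shorter segments at the start of the signal are averaged as-is."""
    out = []
    for i in range(len(signal)):
        lo = max(0, i - window + 1)   # clamp window start at the first sample
        seg = signal[lo:i + 1]
        out.append(sum(seg) / len(seg))
    return out
```

In practice such a filter trades noise suppression against the very latency the abstract's hysteresis compensation is meant to offset: a longer window smooths more but delays the response to genuine motion changes.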
Published in: IEEE Journal of Microwaves (Volume: 5, Issue: 2, March 2025)