
Caption Generation for Sensing-Based Activity Using Attention-Based Learning Models



Abstract:

In recent years, sensing systems have been extensively used for motion detection, activity detection, and gesture recognition, among a few other important applications. Wearable sensors, such as smartwatches and smartphones, contain accelerometer and gyroscope sensors that sense a user's movements and activities to detect abnormal events. Inspired by recent breakthroughs in neural machine translation and the generation of image descriptions, we propose a first-of-its-kind attention-based encoder–decoder model that generates a caption summarizing the various activities detected over a period of time from smartphone sensor data. The proposed model architecture consists of three layers: 1) a bidirectional long short-term memory (BiLSTM) layer incorporates both past and future information from the raw sensor data and generates features; 2) an attention mechanism assigns different weights to the features depending on their importance; and 3) an LSTM layer generates a sequence of activities, from which a caption generator module produces the caption. The performance of the proposed model is evaluated on two widely used public datasets (UCI-HAR and WISDM) and one experimental dataset. The model achieves higher accuracy on UCI-HAR and our experimental dataset than on the WISDM dataset: an average word error rate of 8.20% and an accuracy of 90.75% on the UCI-HAR dataset, and an average word error rate of 10% and an accuracy of 90% on our experimental dataset.
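To make the three-layer architecture in the abstract concrete, the following is a minimal TensorFlow/Keras sketch of a BiLSTM encoder, an attention layer, and an LSTM decoder over windowed sensor data. All hyperparameters (window length, channel count, unit sizes, activity vocabulary) are illustrative assumptions, not values from the paper, and Keras' built-in dot-product Attention stands in for whatever attention variant the authors actually use.

```python
import tensorflow as tf
from tensorflow.keras import layers, Model

WINDOW_LEN = 128      # assumed samples per sensor window (UCI-HAR uses 128)
NUM_CHANNELS = 6      # assumed: 3-axis accelerometer + 3-axis gyroscope
NUM_ACTIVITIES = 6    # assumed size of the activity vocabulary

# 1) BiLSTM encoder: incorporates both past and future context of the
#    raw sensor stream and emits a feature sequence.
inputs = layers.Input(shape=(WINDOW_LEN, NUM_CHANNELS))
features = layers.Bidirectional(
    layers.LSTM(64, return_sequences=True))(inputs)

# 2) Attention: weights each timestep's features by importance
#    (self-attention over the encoder outputs, used here as a stand-in).
attended = layers.Attention()([features, features])

# 3) LSTM decoder: condenses the attended sequence into a representation
#    from which the per-window activity label is predicted; a downstream
#    caption generator (not shown) would turn the label sequence into text.
decoded = layers.LSTM(64)(attended)
outputs = layers.Dense(NUM_ACTIVITIES, activation="softmax")(decoded)

model = Model(inputs, outputs)
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.summary()
```

In this sketch each sensor window yields one activity label; running it over consecutive windows produces the activity sequence that a caption module could verbalize, which is where a word-error-rate metric like the 8.20% reported above would be measured.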
Published in: IEEE Sensors Letters (Volume: 8, Issue: 3, March 2024)
Article Sequence Number: 5500604
Date of Publication: 26 December 2023
Electronic ISSN: 2475-1472

