
Subject-Independent Diver Gesture Classification Using Upper Limb Movement


Abstract:

This letter focuses on categorizing diver gestures by analyzing angle features extracted from the movements of their upper limbs, without exploiting the information encoded by the hands, as is generally the case in the literature. Our approach is intended to be as generic as possible, so as to enable gesture recognition regardless of the diver's equipment and to rely on the signs divers commonly use. New shallow RNN pipelines based on LSTM and GRU are proposed and evaluated against a deterministic DTW-KNN baseline. For underwater gestures, a preliminary energy-based SVM separation stage is introduced to distinguish between one-arm and two-arm gestures. All classification strategies are validated using a leave-one-out protocol on a motion capture dataset comprising 14 divers performing 11 distinct gestures. The database was collected in-house, with a total of 1078 individual gesture recordings. The SVM separation stage clearly improves the results, by 15% for DTW-KNN and by 5% for the RNNs. The best RNN leave-one-out classification accuracy is obtained with the proposed two-layer LSTM network combined with a 1D-convolution layer and a fully connected layer, yielding an 89.5% correct classification rate, compared with a 92% rate for the DTW-KNN baseline.
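
To illustrate the kind of pipeline the abstract describes, the following is a minimal PyTorch sketch of a classifier combining a 1D-convolution layer, a two-layer LSTM, and a fully connected output layer applied to sequences of upper-limb angle features. Only the overall layer arrangement and the 11 gesture classes come from the abstract; the feature dimension, hidden sizes, kernel size, and sequence length are assumptions chosen for illustration, and this is not the authors' implementation.

# Hypothetical sketch: 1D-convolution -> two-layer LSTM -> fully connected layer,
# as named in the abstract. Dimensions and hyperparameters are assumptions.
import torch
import torch.nn as nn

class GestureLSTM(nn.Module):
    def __init__(self, n_angle_features=8, hidden_size=64, n_classes=11):
        super().__init__()
        # 1D convolution along the time axis of the angle-feature sequence
        self.conv = nn.Conv1d(n_angle_features, hidden_size, kernel_size=5, padding=2)
        # Two stacked LSTM layers over the convolved sequence
        self.lstm = nn.LSTM(hidden_size, hidden_size, num_layers=2, batch_first=True)
        # Fully connected layer mapping the final hidden state to gesture logits
        self.fc = nn.Linear(hidden_size, n_classes)

    def forward(self, x):                   # x: (batch, time, n_angle_features)
        x = self.conv(x.transpose(1, 2))    # (batch, hidden_size, time)
        x = torch.relu(x).transpose(1, 2)   # (batch, time, hidden_size)
        _, (h_n, _) = self.lstm(x)          # h_n: (num_layers, batch, hidden_size)
        return self.fc(h_n[-1])             # logits: (batch, n_classes)

# Example: a batch of 4 sequences, 120 time steps, 8 joint-angle features
logits = GestureLSTM()(torch.randn(4, 120, 8))
print(logits.shape)  # torch.Size([4, 11])

In a leave-one-out protocol such as the one described, a model like this would be retrained once per diver, holding that diver's recordings out for testing, so that accuracy reflects subject-independent performance.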
Published in: IEEE Robotics and Automation Letters (Volume: 9, Issue: 11, November 2024)
Page(s): 9311 - 9318
Date of Publication: 06 September 2024
