Contrastive Self-supervised Learning for Sensor-based Human Activity Recognition



Abstract:

Deep learning models applied to sensor-based Human Activity Recognition tasks usually require vast amounts of annotated time-series data to extract robust features. However, annotating signals coming from wearable sensors can be a tedious and often unintuitive process that requires specialized tools and predefined scenarios, making it expensive and time-consuming. This paper combines one of the most recent advances in Self-Supervised Learning (SSL), namely the SimCLR framework, with a powerful transformer-based encoder to introduce a Contrastive Self-supervised learning approach to Sensor-based Human Activity Recognition (CSSHAR) that learns feature representations from unlabeled sensory data. Extensive experiments conducted on three widely used public datasets show that the proposed method outperforms recent SSL models. Moreover, CSSHAR extracts more robust features than an identical, fully supervised transformer both when transferring knowledge from one dataset to another and when only very limited amounts of annotated data are available.
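The core of the SimCLR framework referenced in the abstract is the NT-Xent (normalized temperature-scaled cross-entropy) contrastive loss: two augmented views of the same sensor window are pulled together in embedding space while all other windows in the batch act as negatives. As a rough, hedged sketch (not the paper's actual implementation), the loss can be written in plain NumPy, where `z1` and `z2` are hypothetical encoder outputs for the two views:

```python
import numpy as np

def nt_xent_loss(z1, z2, temperature=0.5):
    """SimCLR-style NT-Xent loss.

    z1, z2: (N, D) arrays of embeddings for two augmented views of the
    same N sensor windows. Row i of z1 and row i of z2 form a positive
    pair; every other row in the batch is treated as a negative.
    """
    z = np.concatenate([z1, z2], axis=0)               # (2N, D)
    z = z / np.linalg.norm(z, axis=1, keepdims=True)   # unit-normalize
    sim = z @ z.T / temperature                        # scaled cosine sims
    np.fill_diagonal(sim, -np.inf)                     # drop self-similarity
    n = z1.shape[0]
    # the positive partner of index i is i+n (and vice versa)
    pos = np.concatenate([np.arange(n, 2 * n), np.arange(n)])
    log_denom = np.log(np.exp(sim).sum(axis=1))        # log-sum-exp over negatives + positive
    loss = -(sim[np.arange(2 * n), pos] - log_denom)
    return loss.mean()
```

In the paper's setting, `z1` and `z2` would come from applying stochastic time-series augmentations (e.g., jitter or scaling) to each window before passing both views through the transformer encoder; the exact augmentations and projection head are details of the paper, not shown here.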
Date of Conference: 04-07 August 2021
Date Added to IEEE Xplore: 20 July 2021
Conference Location: Shenzhen, China
