
Feature-Representation Transfer Learning for Human Activity Recognition


Abstract:

Many human-centered intelligent systems require information about the activities being performed by the user in order to function optimally. Human activity recognition (HAR) is at the core of such systems. Activity recognition requires vast amounts of labeled training data to perform adequately under a variety of circumstances. The scarcity of labeled training data motivates transfer learning (TL), a technique that reuses knowledge learned from one task's dataset to more easily perform a different task. In this paper, we show how TL can be used to improve the recognition of human activities from a small number of data samples. Using a convolutional neural network - long short-term memory (CNN-LSTM) deep learning ensemble classifier, we show how features learned from activities involving motion enable us to learn features of stationary activities, even with a small dataset. TL improved model generalizability and reduced overfitting. To evaluate performance, we used the UCI HAR dataset, which contains 6 activities and which we split into two subtasks. Accuracy increased by over 4% and loss decreased by 30% between the base model and the TL model. We also discuss the opportunities TL presents to the field of human activity recognition, where new activities with very small amounts of training data can be learned using data from existing datasets.
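The feature-representation transfer described above can be sketched as follows. This is a hypothetical illustration, not the authors' exact architecture: layer sizes, the 128-timestep x 9-channel window shape (typical of UCI HAR), and the 3/3 split of the 6 activities into a "motion" subtask and a "stationary" subtask are all assumptions for the sake of the example.

```python
# Hypothetical sketch of CNN-LSTM feature-representation transfer.
# Assumed input: windows of 128 timesteps x 9 inertial channels,
# as in the UCI HAR dataset; layer sizes are illustrative only.
from tensorflow.keras import layers, models

def build_base(n_classes):
    # Convolutional feature extractor followed by an LSTM,
    # ending in a softmax classification head.
    inputs = layers.Input(shape=(128, 9))
    x = layers.Conv1D(64, 3, activation="relu")(inputs)
    x = layers.Conv1D(64, 3, activation="relu")(x)
    x = layers.MaxPooling1D(2)(x)
    x = layers.LSTM(64)(x)
    outputs = layers.Dense(n_classes, activation="softmax")(x)
    return models.Model(inputs, outputs)

# Step 1: train the base model on the "motion" subtask
# (e.g. walking, walking upstairs, walking downstairs).
base = build_base(n_classes=3)
# base.compile(...); base.fit(motion_X, motion_y, ...)

# Step 2: transfer. Keep the learned feature layers (everything up
# to the LSTM output), freeze them, and train only a new head on the
# small "stationary" subtask (e.g. sitting, standing, lying).
features = models.Model(base.input, base.layers[-2].output)
features.trainable = False  # freeze transferred feature layers
head = layers.Dense(3, activation="softmax")(features.output)
tl_model = models.Model(features.input, head)
tl_model.compile(optimizer="adam",
                 loss="sparse_categorical_crossentropy",
                 metrics=["accuracy"])
```

With far fewer trainable parameters (only the new head), the transfer model can fit the small stationary-activity dataset with less overfitting than training the full network from scratch.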
Date of Conference: 16-18 October 2019
Date Added to IEEE Xplore: 27 December 2019
Print on Demand(PoD) ISSN: 2162-1233
Conference Location: Jeju, Korea (South)

