
Continuous functional activity monitoring based on wearable tri-axial accelerometer and gyroscope


5 Author(s): Yuting Zhang (Department of Electrical and Computer Engineering, Boston University, Boston, USA); Stacey Markovic; Inbal Sapir; Robert C. Wagenaar; and others

Given the growing number of elderly people and patients diagnosed with Parkinson's disease, monitoring functional activities with wearable wireless sensors can help promote quality of life and healthier lifestyles. We propose a novel and practical solution using three small wearable wireless Functional Activity Monitor (FAM) sensors and a smartphone to store, transmit, analyze, and update data. Three sensors, each composed of a tri-axial accelerometer and a tri-axial gyroscope, are attached to the chest and both thighs. A computationally efficient signal processing algorithm is designed to accurately measure tilting angles. A continuous activity recognition algorithm is developed using a decision tree based on time-series data and spectrum analysis; this algorithm can identify activities of daily life in three general categories: (1) postures such as standing, sitting, and lying; (2) locomotion such as walking; and (3) transitions such as sit-to-stand and stand-to-sit. The results show accurate angle measurement compared to the Optotrak 3020 motion capture system and reliable detection of all activities, with a sensitivity of at least 96.2% compared to video recordings.
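The abstract does not detail the tilt-angle algorithm, but a common approach for fusing a tri-axial accelerometer (noisy but drift-free gravity reference) with a gyroscope (smooth but drift-prone angular rate) is a complementary filter. The sketch below is an assumption for illustration, not the paper's method; the function names, the blending weight `alpha`, and the sampling period `dt` are hypothetical choices.

```python
import math

def tilt_from_accel(ax, ay, az):
    """Pitch angle (radians) estimated from the gravity direction
    seen in the accelerometer frame."""
    return math.atan2(ax, math.sqrt(ay * ay + az * az))

def complementary_filter(accel_samples, gyro_rates, dt, alpha=0.98):
    """Fuse integrated gyroscope rate (smooth, drifts over time) with
    the accelerometer tilt estimate (noisy, but drift-free).

    accel_samples: list of (ax, ay, az) tuples in g
    gyro_rates:    list of pitch-axis angular rates in rad/s
    dt:            sampling period in seconds
    alpha:         weight on the gyro path (close to 1 trusts the gyro
                   for short-term motion, the accelerometer long-term)
    """
    angle = tilt_from_accel(*accel_samples[0])
    angles = [angle]
    for (ax, ay, az), rate in zip(accel_samples[1:], gyro_rates[1:]):
        gyro_angle = angle + rate * dt              # integrate angular rate
        accel_angle = tilt_from_accel(ax, ay, az)   # gravity-based estimate
        angle = alpha * gyro_angle + (1 - alpha) * accel_angle
        angles.append(angle)
    return angles
```

With the sensor lying flat and still (gravity entirely on the z axis, zero angular rate), the filter holds the angle near zero; tilting the device so gravity moves onto the x axis drives the estimate toward 90 degrees.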

Published in: 2011 5th International Conference on Pervasive Computing Technologies for Healthcare (PervasiveHealth) and Workshops

Date of Conference: 23-26 May 2011