Making feature selection for human motion recognition more interactive through the use of taxonomies

Authors:

Martin Lösch, Sven R. Schmidt-Rohr, Rüdiger Dillmann (Institute of Computer Science and Engineering (CSE), University of Karlsruhe, Germany)

Human activity recognition is an essential ability for service robots and other robotic systems that interact with human beings. To be proactive, such a system must be able to evaluate the current state of the user it is dealing with. Future surveillance systems will also benefit from robust activity recognition, provided real-time constraints are met, since it allows the automation of tasks that currently still have to be performed by humans. In this paper, a novel approach for integrating feature selection into human motion recognition is proposed. Typically, features are chosen according to their relevance for classifying the activity to be recognized. Our approach extends this process by incorporating background knowledge about the features and active user engagement: using taxonomies built on the complete feature set, users are provided with an interface that lets them guide and refine the selection process. This avoids certain problems that commonly arise when the system is trained on noisy or small amounts of training data.
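
The abstract describes the approach only at a high level and the paper's implementation is not reproduced here. As a rough illustration of the idea, the following Python sketch shows one way taxonomy-guided feature selection could be structured: data-driven relevance scores are attached to leaf features, and the user prunes whole taxonomy subtrees before a relevance threshold is applied. All names (FeatureNode, select_features, the example features) and the scoring scheme are illustrative assumptions, not the authors' method.

    # Hypothetical sketch of taxonomy-guided feature selection.
    # The taxonomy, feature names, and threshold are assumptions for
    # illustration; the paper does not specify this implementation.
    from dataclasses import dataclass, field
    from typing import List

    @dataclass
    class FeatureNode:
        """A taxonomy node: a semantic feature group or a leaf feature."""
        name: str
        children: List["FeatureNode"] = field(default_factory=list)
        relevance: float = 0.0  # data-driven score, meaningful for leaves

    def select_features(root, excluded_groups, threshold=0.1):
        """Collect leaf features whose relevance passes the threshold,
        after the user has excluded whole subtrees via the taxonomy."""
        selected = []

        def walk(node):
            if node.name in excluded_groups:  # user prunes an entire subtree
                return
            if not node.children and node.relevance >= threshold:
                selected.append(node.name)
            for child in node.children:
                walk(child)

        walk(root)
        return selected

    # Example: a tiny taxonomy over body-motion features (hypothetical).
    arm = FeatureNode("arm", [FeatureNode("elbow_angle", relevance=0.4),
                              FeatureNode("wrist_speed", relevance=0.05)])
    leg = FeatureNode("leg", [FeatureNode("knee_angle", relevance=0.3)])
    body = FeatureNode("body", [arm, leg])

    print(select_features(body, excluded_groups={"leg"}))  # ['elbow_angle']

In this sketch, excluding the "leg" subtree removes all leg-related features in a single interaction, which mirrors the paper's point that background knowledge and user input can compensate for relevance scores distorted by noisy or scarce training data.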

Published in: RO-MAN 2008 - The 17th IEEE International Symposium on Robot and Human Interactive Communication

Date of Conference: 1-3 Aug. 2008