
Dynamic gestures as an input device for directing a mobile platform

3 Author(s)
Ehrenmann, M. ; Inst. for Process Control & Robotics, Karlsruhe Univ., Germany ; Lütticke, T. ; Dillmann, R.

Giving instructions to a mobile robot still requires classical user interfaces. Verbal or gesture commands offer a more intuitive way of commanding. In this article, we present new approaches and enhancements to established methods in use in our laboratory. Our aim is to direct a robot with simple dynamic gestures, and we focus on visual gesture recognition. The user's hand is tracked with skin-color segmentation algorithms, and hidden Markov models are used to recognize the gesture type. Filters applied to the recorded trajectory strongly compress the input data and mark the start and end points of a possible gesture. The hidden Markov models have been augmented with a threshold model in order to reject insignificant movements, and pre-classification of the reference gestures keeps the computational effort low.
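The threshold-model idea described in the abstract can be sketched as follows: each gesture type has its own HMM, plus one "garbage" threshold model, and an observation sequence is accepted only if the best-scoring gesture model outscores the threshold model. The sketch below is illustrative only; the toy model parameters, the quantized observation symbols, and all function names are invented for the example and are not taken from the paper.

```python
import numpy as np

def log_likelihood(obs, pi, A, B):
    """Scaled forward algorithm for a discrete HMM.
    obs: sequence of symbol indices; pi: initial state probabilities;
    A: state-transition matrix; B: emission matrix (states x symbols).
    Returns log P(obs | model)."""
    alpha = pi * B[:, obs[0]]
    logp = np.log(alpha.sum())
    alpha = alpha / alpha.sum()
    for o in obs[1:]:
        alpha = (alpha @ A) * B[:, o]
        c = alpha.sum()
        logp += np.log(c)
        alpha = alpha / c
    return logp

def classify(obs, gesture_models, threshold_model):
    """Pick the best-scoring gesture HMM; reject the movement as
    insignificant if the threshold model scores at least as high."""
    scores = {name: log_likelihood(obs, *m) for name, m in gesture_models.items()}
    best = max(scores, key=scores.get)
    if scores[best] <= log_likelihood(obs, *threshold_model):
        return None  # insignificant movement, no gesture recognized
    return best

# Invented 2-state, 2-symbol toy models, for illustration only.
pi = np.array([1.0, 0.0])
A = np.full((2, 2), 0.5)
gestures = {
    "right": (pi, A, np.array([[0.9, 0.1], [0.9, 0.1]])),  # mostly emits symbol 0
    "up":    (pi, A, np.array([[0.1, 0.9], [0.1, 0.9]])),  # mostly emits symbol 1
}
threshold = (pi, A, np.full((2, 2), 0.5))  # uniform "garbage" model

print(classify([0, 0, 0, 0], gestures, threshold))  # "right"
print(classify([0, 1, 0, 1], gestures, threshold))  # None (rejected)
```

An alternating sequence is equally unlikely under every gesture model, so the uniform threshold model wins and the movement is discarded rather than forced into the nearest gesture class, which is the point of the threshold-model enhancement.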

Published in:

Proceedings of the 2001 IEEE International Conference on Robotics and Automation (ICRA 2001), Volume 3

Date of Conference: