Abstract:
This paper presents a gesture-based human-robot interaction system. Gestures are recognized by segmenting the three largest skin-colored components and applying PCA-based pattern-matching techniques. A gesture command is generated and issued whenever the combination of the three skin-like regions in a given image frame matches a predefined gesture. These gesture commands are sent to a robot through a frame-based software platform, and the robot acts in accordance with the task predefined for each gesture. The paper also proposes a method to detect the face and locate its position, and to distinguish the left hand from the right hand relative to the face position. The effectiveness of this method has been demonstrated through interaction with a pet robot named AIBO over eight operations.
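The segmentation step described above, extracting the three largest skin-like regions (face and two hands) from a frame, can be sketched as follows. This is a minimal illustration, not the authors' implementation: the RGB skin rule and the 4-connected flood fill are assumptions, since the abstract does not specify the color model or connectivity used.

```python
from collections import deque

def is_skin(r, g, b):
    # A common heuristic RGB skin rule (an assumption; the paper's exact
    # skin-color model is not given in the abstract).
    return (r > 95 and g > 40 and b > 20 and
            r > g and r > b and (r - min(g, b)) > 15)

def three_largest_skin_regions(image):
    """Return the three largest connected skin-colored components.

    `image` is a 2-D list of (r, g, b) tuples. Each component is a set of
    (row, col) coordinates; components are returned largest first.
    """
    h, w = len(image), len(image[0])
    mask = [[is_skin(*image[y][x]) for x in range(w)] for y in range(h)]
    seen = [[False] * w for _ in range(h)]
    components = []
    for y in range(h):
        for x in range(w):
            if mask[y][x] and not seen[y][x]:
                # BFS flood fill over 4-connected skin pixels.
                queue = deque([(y, x)])
                seen[y][x] = True
                comp = set()
                while queue:
                    cy, cx = queue.popleft()
                    comp.add((cy, cx))
                    for ny, nx in ((cy - 1, cx), (cy + 1, cx),
                                   (cy, cx - 1), (cy, cx + 1)):
                        if (0 <= ny < h and 0 <= nx < w and
                                mask[ny][nx] and not seen[ny][nx]):
                            seen[ny][nx] = True
                            queue.append((ny, nx))
                components.append(comp)
    components.sort(key=len, reverse=True)
    # Candidate face / left-hand / right-hand regions.
    return components[:3]
```

In the full pipeline, each of the three regions would then be matched against trained gesture templates via PCA, and the largest region's position would serve as the face reference for labeling the hands as left or right.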
Published in: 2004 IEEE International Conference on Systems, Man and Cybernetics (IEEE Cat. No.04CH37583)
Date of Conference: 10-13 October 2004
Date Added to IEEE Xplore: 07 March 2005
Print ISBN: 0-7803-8566-7
Print ISSN: 1062-922X