This paper presents a platform for implementing and evaluating a learning-by-imitation framework that enables humanoid robots to learn hand gestures from humans. A marker-based system captures human motion data, from which we extract the shoulder and elbow joint angles that uniquely characterize a particular hand gesture. The proposed imitation learning framework generalizes over multiple demonstrations of the same hand gesture and thus learns it. The joint angle trajectories used for training are first aligned temporally using Dynamic Time Warping (DTW) and then generalized by weighted averaging; the framework operates entirely in joint space. The algorithm has been implemented and tested on the Nao humanoid robot. We also propose a novel method to evaluate the imitation learning framework: markers are placed on the robot's arm analogously to their placement on the human subject's arm, and the respective joint angle trajectories are compared.
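The core generalization step described above (DTW alignment of the demonstrated joint angle trajectories, followed by weighted averaging) can be sketched roughly as follows. This is a minimal illustration, not the paper's actual implementation: it assumes 1-D joint angle trajectories, uses the first demonstration as the alignment reference, and uses the classic dynamic-programming formulation of DTW with an absolute-difference local cost.

```python
import numpy as np

def dtw_path(a, b):
    """Classic DTW between 1-D series a and b; returns the optimal
    alignment path as a list of (index-into-a, index-into-b) pairs."""
    n, m = len(a), len(b)
    cost = np.full((n + 1, m + 1), np.inf)
    cost[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = abs(a[i - 1] - b[j - 1])  # local distance
            cost[i, j] = d + min(cost[i - 1, j],      # insertion
                                 cost[i, j - 1],      # deletion
                                 cost[i - 1, j - 1])  # match
    # Backtrack from the end of both series to recover the warping path.
    path, i, j = [], n, m
    while i > 0 and j > 0:
        path.append((i - 1, j - 1))
        step = np.argmin([cost[i - 1, j - 1], cost[i - 1, j], cost[i, j - 1]])
        if step == 0:
            i, j = i - 1, j - 1
        elif step == 1:
            i -= 1
        else:
            j -= 1
    return path[::-1]

def warp_onto(reference, demo):
    """Resample `demo` onto the timeline of `reference` via the DTW path,
    averaging demo samples that map to the same reference frame."""
    warped = np.zeros(len(reference))
    counts = np.zeros(len(reference))
    for i, j in dtw_path(reference, demo):
        warped[i] += demo[j]
        counts[i] += 1
    return warped / counts

def generalize(demos, weights):
    """Align all demonstrations to the first one, then take a weighted
    average to obtain the generalized joint angle trajectory."""
    ref = np.asarray(demos[0], dtype=float)
    aligned = np.vstack([ref] + [warp_onto(ref, np.asarray(d, float))
                                 for d in demos[1:]])
    w = np.asarray(weights, dtype=float)
    return (w[:, None] * aligned).sum(axis=0) / w.sum()
```

For example, two demonstrations of the same motion performed at different speeds are first brought onto a common timeline, so the averaging blends corresponding phases of the gesture rather than corresponding clock times. In a full system this would be applied per joint (shoulder and elbow angles) across all training demonstrations.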
Date of Conference: 20-23 June 2010