
Toward automatic robot instruction from perception: mapping human grasps to manipulator grasps


Authors: Sing Bing Kang (Research Laboratory, Digital Equipment Corporation, Cambridge, MA, USA); K. Ikeuchi

Our approach to programming a robot is by direct human demonstration. The system observes a human performing the task, recognizes the human grasp, and maps it onto the manipulator. This paper describes how an observed human grasp can be mapped to that of a given general-purpose manipulator for task replication. Planning the manipulator grasp based upon the observed human grasp is done at two levels: the functional level and the physical level. Initially, at the functional level, grasp mapping is achieved at the virtual finger level; a virtual finger is a group of fingers acting against an object surface in a similar manner. Subsequently, at the physical level, the geometric properties of the object and manipulator are considered in fine-tuning the manipulator grasp. Our work concentrates on power (enveloping) grasps and fingertip precision grasps. We conclude by showing an example of an entire programming cycle from human demonstration to robot execution.
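To make the two-level mapping described in the abstract concrete, the sketch below gives a minimal Python illustration of the idea. It is not the authors' implementation: the names (VirtualFinger, Grasp, map_functional_level, refine_physical_level) and the even-split assignment of manipulator fingers to virtual fingers are assumptions introduced purely for illustration.

```python
# Minimal illustrative sketch of two-level grasp mapping (hypothetical names,
# not taken from the paper).
from dataclasses import dataclass
from typing import List


@dataclass
class VirtualFinger:
    """A group of fingers acting against an object surface in a similar manner."""
    finger_ids: List[int]        # indices of the real fingers in this group
    contact_normal: List[float]  # representative force direction against the surface


@dataclass
class Grasp:
    virtual_fingers: List[VirtualFinger]
    grasp_type: str              # e.g. "power" (enveloping) or "fingertip precision"


def map_functional_level(human_grasp: Grasp, manipulator_fingers: int) -> Grasp:
    """Functional level: reassign each human virtual finger to a group of
    manipulator fingers, preserving the number and role of virtual fingers.
    The even-split assignment below is a placeholder, not the paper's method."""
    n_vf = len(human_grasp.virtual_fingers)
    fingers = list(range(manipulator_fingers))
    mapped = [
        VirtualFinger(finger_ids=fingers[i::n_vf], contact_normal=vf.contact_normal)
        for i, vf in enumerate(human_grasp.virtual_fingers)
    ]
    return Grasp(virtual_fingers=mapped, grasp_type=human_grasp.grasp_type)


def refine_physical_level(grasp: Grasp, object_geometry, manipulator_geometry) -> Grasp:
    """Physical level: fine-tune contact placement using the geometry of the
    object and the manipulator. Left as a stub here, since the paper's
    geometric adjustment is not reproduced in this sketch."""
    return grasp


# Usage: map an observed two-virtual-finger human grasp onto a 3-fingered hand.
human = Grasp(
    virtual_fingers=[
        VirtualFinger(finger_ids=[0], contact_normal=[0.0, 0.0, 1.0]),
        VirtualFinger(finger_ids=[1, 2, 3], contact_normal=[0.0, 0.0, -1.0]),
    ],
    grasp_type="fingertip precision",
)
robot = refine_physical_level(map_functional_level(human, manipulator_fingers=3),
                              object_geometry=None, manipulator_geometry=None)
```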

Published in: IEEE Transactions on Robotics and Automation (Volume: 13, Issue: 1)