This paper presents an integrated system that combines learning, a natural-language interface, and robotic grasping to enable the transfer of grasping skills from nontechnical users to robots. The system consists of two parts: a natural-language interface for grasping commands and a learning system. This paper focuses on the learning system and on testing the entire system in a small usability study. The learning system operates in two phases. In the first phase, the system learns to predict the next command the user intends to issue, based on command sequences recorded during previous grasping sessions. In the second phase, the system predicts the user's current state and moves the robot's gripper to the intended target endpoint to attempt to grasp the object. A usability study with eight nontechnical users and a 5-degree-of-freedom (DOF) robot arm was conducted to observe the impact of the learning system on user performance and satisfaction during a grasping operation. Experimental results show that the system was effective in learning users' grasping intentions, which allowed it to reduce the average time to grasp an object. In addition, participants' feedback from the usability study was generally positive toward having an adaptive robotics system that learns from their commands.
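The abstract does not specify how next-command prediction is implemented; as a minimal illustrative sketch (not the authors' method), learning from recorded command sequences could be framed as a bigram transition model, where the predictor counts how often one command follows another across sessions and suggests the most frequent successor. The class and command names below are hypothetical.

```python
from collections import Counter, defaultdict

class NextCommandPredictor:
    """Illustrative bigram model: predicts the most likely next
    command from transition counts in recorded command sequences.
    This is an assumption for exposition, not the paper's algorithm."""

    def __init__(self):
        # transitions[prev][next] = number of times `next` followed `prev`
        self.transitions = defaultdict(Counter)

    def train(self, sessions):
        """sessions: list of command sequences from prior grasping sessions."""
        for seq in sessions:
            for prev, nxt in zip(seq, seq[1:]):
                self.transitions[prev][nxt] += 1

    def predict(self, last_command):
        """Return the most frequent successor of `last_command`, or None."""
        counts = self.transitions.get(last_command)
        if not counts:
            return None  # no recorded data for this command yet
        return counts.most_common(1)[0][0]
```

For example, after training on sessions where "open" is usually followed by "move left", the predictor would propose "move left" the next time the user says "open", letting the system anticipate the user's intent and shorten the interaction.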