Unifying configuration space and sensor space for vision-based motion planning

2 Author(s)
Sharma, R. (Beckman Inst. for Adv. Sci. & Technol., Univ. of Illinois, Urbana, IL, USA); Sutanto, H.

Visual feedback can play a crucial role in a dynamic robotic task such as the interception of a moving target. To utilize the feedback effectively, robot motion planning techniques are needed that also take into account properties of the sensed data. We propose a motion planning framework that achieves this with the help of a space called the perceptual control manifold (PCM), defined on the product of the robot configuration space and an image-based feature space. We show how the task of intercepting a moving target can be mapped to the PCM. This leads to the generation of motion plans that satisfy various constraints and optimality criteria derived from the robot kinematics, the control system, and the sensing mechanism.
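The abstract describes the PCM as a space built on the product of the robot configuration space and an image-based feature space. The sketch below is not the paper's formulation; the planar 2-DOF arm, link lengths, and overhead pinhole camera model are all assumptions, chosen only to illustrate how product-space points can be formed by pairing each configuration with the image feature a sensing model predicts for it.

```python
# Minimal sketch (assumed models, not the authors' method): points of a
# PCM-like product space C x F for a planar 2-DOF arm viewed by a
# hypothetical overhead camera. Each point pairs a configuration q with
# the image feature y(q) the camera model would sense.
import numpy as np

L1, L2 = 0.5, 0.4          # assumed link lengths (m)
FOCAL = 500.0              # assumed focal length (pixels)
CX, CY = 320.0, 240.0      # assumed principal point (pixels)
CAM_HEIGHT = 2.0           # assumed camera height above the work plane (m)

def forward_kinematics(q):
    """End-effector position in the plane for joint angles q = (q1, q2)."""
    q1, q2 = q
    x = L1 * np.cos(q1) + L2 * np.cos(q1 + q2)
    y = L1 * np.sin(q1) + L2 * np.sin(q1 + q2)
    return np.array([x, y])

def image_feature(q):
    """Image-plane feature: projection of the end effector through a
    simple overhead pinhole camera (a stand-in for the real sensing model)."""
    px, py = forward_kinematics(q)
    u = CX + FOCAL * px / CAM_HEIGHT
    v = CY + FOCAL * py / CAM_HEIGHT
    return np.array([u, v])

def pcm_point(q):
    """A point on the product space: (configuration, image feature)."""
    return np.concatenate([q, image_feature(q)])

# Sample the manifold over a grid of configurations; a planner could search
# such samples for paths that satisfy kinematic and sensing constraints.
qs = [np.array([a, b])
      for a in np.linspace(-np.pi, np.pi, 25)
      for b in np.linspace(-np.pi / 2, np.pi / 2, 25)]
pcm_samples = np.array([pcm_point(q) for q in qs])
print(pcm_samples.shape)   # (625, 4): 2 configuration dims + 2 feature dims
```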

Published in:

Proceedings of the 1996 IEEE International Conference on Robotics and Automation (Volume 4)

Date of Conference:

22-28 Apr 1996