Sensor-based roadmaps for motion planning for articulated robots in unknown environments: some experiments with an eye-in-hand system

2 Author(s)
Yong Yu; K. Gupta (Sch. of Eng. Sci., Simon Fraser Univ., Burnaby, BC, Canada)

We present a fully implemented “eye-in-hand” test-bed system for sensor-based, collision-free motion planning for articulated robot arms. The system consists of a PUMA 560 with a triangulation-based area-scan laser range finder (the eye) mounted on its wrist. The framework for our planning approach was presented in Yu and Gupta (1998). Inspired by motion planning research, it incrementally builds a roadmap that represents the connectivity of the free configuration space as the system senses the physical environment. We present experimental results with our sensor-based planner running on this real test-bed. The robot starts in completely unknown and cluttered environments. Typically, the planner reaches the goal configuration (planning as it senses) in about 7-25 scans, depending on scene complexity, while avoiding collisions with obstacles throughout.
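The sense-and-plan loop the abstract describes can be sketched in miniature. The sketch below is an illustrative assumption, not the authors' implementation: it replaces the PUMA 560's configuration space with a toy 2-D grid, the laser range finder with a local `scan` that reveals nearby cells, and the roadmap with a breadth-first search restricted to cells already sensed as free. All names (`scan`, `bfs`, `sense_and_plan`, `SENSOR_RANGE`) are hypothetical.

```python
from collections import deque

# Minimal sketch (assumed toy model): a point robot on a 2-D grid learns the
# world only through local scans and plans only through sensed-free cells.
GRID = 8                                   # world is GRID x GRID cells
SENSOR_RANGE = 2                           # Chebyshev radius revealed per scan
OBSTACLES = {(3, y) for y in range(5)}     # a wall with a gap at y >= 5

def neighbors(cell):
    x, y = cell
    for dx, dy in ((1, 0), (-1, 0), (0, 1), (0, -1)):
        nx, ny = x + dx, y + dy
        if 0 <= nx < GRID and 0 <= ny < GRID:
            yield (nx, ny)

def scan(pos, known_free, known_obst):
    """Reveal every cell within SENSOR_RANGE of the current position."""
    x0, y0 = pos
    for x in range(max(0, x0 - SENSOR_RANGE), min(GRID, x0 + SENSOR_RANGE + 1)):
        for y in range(max(0, y0 - SENSOR_RANGE), min(GRID, y0 + SENSOR_RANGE + 1)):
            (known_obst if (x, y) in OBSTACLES else known_free).add((x, y))

def bfs(pos, known_free):
    """BFS restricted to sensed-free cells: the incrementally built roadmap."""
    parent, queue = {pos: None}, deque([pos])
    while queue:
        c = queue.popleft()
        for n in neighbors(c):
            if n in known_free and n not in parent:
                parent[n] = c
                queue.append(n)
    return parent

def extract(parent, target):
    """Walk parent pointers back from target; None if target is unreachable."""
    if target not in parent:
        return None
    path, c = [], target
    while c is not None:
        path.append(c)
        c = parent[c]
    return path[::-1]

def sense_and_plan(start, goal, max_scans=100):
    """Alternate scanning and planning until the goal is reachable through
    sensed free space, moving toward unexplored frontier cells in between."""
    known_free, known_obst = set(), set()
    pos, scans = start, 0
    while scans < max_scans:
        scan(pos, known_free, known_obst)
        scans += 1
        reach = bfs(pos, known_free)
        path = extract(reach, goal)
        if path:
            return path, scans             # goal reachable via sensed free space
        # Head for the reachable free cell nearest the goal that still borders
        # unsensed space (a simple exploration heuristic, assumed here).
        frontier = [c for c in reach
                    if any(n not in known_free and n not in known_obst
                           for n in neighbors(c))]
        if not frontier:
            return None, scans             # fully explored; goal unreachable
        target = min(frontier,
                     key=lambda c: abs(c[0] - goal[0]) + abs(c[1] - goal[1]))
        pos = extract(reach, target)[1]    # take one step along the roadmap
    return None, scans
```

Calling `sense_and_plan((0, 0), (7, 0))` makes the toy robot discover the wall mid-run, detour through the gap, and return a path that stays inside sensed free space throughout, with the scan counter playing the role of the per-run scan counts reported in the abstract.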

Published in:

Proceedings of the 1999 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS '99), Volume 3

Date of Conference: