Vision-based navigation and environmental representations with an omnidirectional camera

Authors: J. Gaspar (Inst. de Sistemas e Robotica, Inst. Superior Tecnico, Lisbon, Portugal); N. Winters; J. Santos-Victor

Abstract: This paper proposes a method for the vision-based navigation of a mobile robot in indoor environments, using a single omnidirectional (catadioptric) camera. The geometry of the catadioptric sensor and the method used to obtain a bird's eye (orthographic) view of the ground plane are presented. This representation significantly simplifies the solution to navigation problems by eliminating any perspective effects. The nature of each navigation task is taken into account when designing the required navigation skills and environmental representations. We propose two main navigation modalities: topological navigation and visual path following. Topological navigation is used for traveling long distances and does not require knowledge of the exact position of the robot, but rather a qualitative position on the topological map. The navigation process combines appearance-based methods and visual servoing upon some environmental features. Visual path following is required for local, very precise navigation, e.g., door traversal or docking. The robot is controlled to follow a prespecified path accurately by tracking visual landmarks in bird's eye views of the ground plane. By clearly separating the nature of these navigation tasks, a simple yet powerful navigation system is obtained.
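The appearance-based localization used in topological navigation can be thought of as nearest-neighbor matching of the current view against a database of stored reference views, yielding a qualitative position on the topological map rather than a metric pose. A minimal sketch of that idea, with all function names and the sum-of-squared-differences similarity chosen for illustration (the paper's actual matching pipeline may differ):

```python
import numpy as np

def localize(current_view, reference_views):
    """Return the index of the stored reference view closest to the
    current image, using sum-of-squared-differences as the distance.
    The index stands in for a node (qualitative position) on the
    topological map."""
    errors = [np.sum((current_view - ref) ** 2) for ref in reference_views]
    return int(np.argmin(errors))

# Toy example: three reference "views" along a corridor, represented
# here as small constant-intensity images.
refs = [np.full((4, 4), v, dtype=float) for v in (0.0, 0.5, 1.0)]
query = np.full((4, 4), 0.45, dtype=float)
print(localize(query, refs))  # closest to the 0.5 view -> 1
```

In practice the reference views would be stored at much lower resolution (or as compressed descriptors) so the comparison stays cheap over a long corridor of views.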

Published in:

IEEE Transactions on Robotics and Automation (Volume: 16, Issue: 6)