Omnidirectional visual homing using the 1D trifocal tensor

Authors:

Aranda, M. (DIIS - I3A, Univ. of Zaragoza, Zaragoza, Spain); Lopez-Nicolas, G.; Sagues, C.

This paper presents a new method for visual homing for a robot moving on the ground plane. A relevant issue in vision-based navigation is the field-of-view constraint of conventional cameras. We overcome this problem by means of omnidirectional vision, and we propose a vision-based homing control scheme that relies on the 1D trifocal tensor. The technique employs a reference set of images of the environment previously acquired at different locations, together with the images taken by the robot during its motion. To take advantage of the qualities of omnidirectional vision, we define a purely angle-based approach that requires no distance information. This approach, which takes the planar motion constraint into account, motivates the use of the 1D trifocal tensor. In particular, the additional geometric constraints enforced by the tensor improve the robustness of the method in the presence of mismatches. The interest of our proposal is that the designed control scheme computes the robot velocities from angular information alone, which is very precise; in addition, we present a procedure that computes the angular relations between all the views even if they are not directly related by feature matches. The feasibility of the proposed approach is supported by a stability analysis and by results from simulations and experiments with real images.
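To illustrate the kind of estimation involved (not the authors' implementation), the 1D trifocal tensor can be computed linearly from bearing-only correspondences: each feature matched across three views, with its bearing represented as a point on the 1D projective line, yields one trilinear constraint on the eight tensor entries, so at least seven matches determine the tensor up to scale. The following Python/NumPy sketch shows this linear estimation step; the function and variable names are hypothetical.

import numpy as np

def bearing_to_p1(theta):
    """Represent a bearing angle as a homogeneous point on the 1D projective line."""
    return np.array([np.cos(theta), np.sin(theta)])

def estimate_1d_trifocal_tensor(thetas_1, thetas_2, thetas_3):
    """
    Linearly estimate the 2x2x2 1D trifocal tensor T from bearing-only
    correspondences across three views, using the trilinear constraint
        sum_{i,j,k} T[i,j,k] * u[i] * v[j] * w[k] = 0.
    Each correspondence gives one equation; at least 7 are needed to fix T up to scale.
    """
    assert len(thetas_1) == len(thetas_2) == len(thetas_3) >= 7
    rows = []
    for t1, t2, t3 in zip(thetas_1, thetas_2, thetas_3):
        u, v, w = bearing_to_p1(t1), bearing_to_p1(t2), bearing_to_p1(t3)
        # One row of the linear system: the outer product u_i v_j w_k, flattened in (i, j, k) order.
        rows.append(np.einsum('i,j,k->ijk', u, v, w).ravel())
    A = np.asarray(rows)
    # The tensor entries are the right singular vector associated with the smallest
    # singular value (least-squares null space of A).
    _, _, Vt = np.linalg.svd(A)
    T = Vt[-1].reshape(2, 2, 2)
    return T / np.linalg.norm(T)

Solving for the null space via SVD is the standard linear (DLT-style) approach for such multilinear constraints; in practice one would combine it with a robust estimation loop to handle the mismatches mentioned in the abstract.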

Published in:

2010 IEEE International Conference on Robotics and Automation (ICRA)

Date of Conference:

3-7 May 2010