
Monocular vision based robot self-localization

3 Author(s): Jiaolong Yang (Beijing Lab. of Intell. Inf. Technol., Beijing Inst. of Technol., Beijing, China); Lei Chen; Wei Liang

In this paper, we propose a position tracking method for robot self-localization with monocular vision. The robot locates itself relying solely on its onboard monocular camera, so the localization result is not affected by odometer errors caused by wheel slippage. Our approach uses the Scale Invariant Feature Transform (SIFT) for feature detection and matching over consecutive frames to compute the fundamental matrix from epipolar geometry. A robust outlier elimination technique is combined with iterative computation to improve the robustness and accuracy of the fundamental matrix estimate. The motion parameters, including the rotation matrix and the displacement direction vector, are calculated from the fundamental matrix and the pre-calibrated intrinsic parameters. A recursive displacement computation algorithm then solves for the displacement length used in position tracking. Experiments carried out in an indoor environment demonstrate the effectiveness of our approach.
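The core step the abstract describes, estimating the fundamental matrix from matched features across consecutive frames, can be sketched with the standard normalized eight-point algorithm. This is a minimal illustration, not the authors' implementation: the intrinsics `K`, the camera poses, and the synthetic 3D points below are hypothetical, and the robust outlier elimination (e.g. RANSAC) mentioned in the abstract is omitted for brevity.

```python
import numpy as np

def normalize(pts):
    """Hartley normalization: centroid to origin, mean distance sqrt(2)."""
    c = pts.mean(axis=0)
    d = np.sqrt(((pts - c) ** 2).sum(axis=1)).mean()
    s = np.sqrt(2) / d
    T = np.array([[s, 0, -s * c[0]],
                  [0, s, -s * c[1]],
                  [0, 0, 1.0]])
    ph = np.hstack([pts, np.ones((len(pts), 1))])
    return (T @ ph.T).T, T

def eight_point(p1, p2):
    """Normalized eight-point estimate of F from >= 8 point matches."""
    x1, T1 = normalize(p1)
    x2, T2 = normalize(p2)
    # Each match gives one linear constraint x2^T F x1 = 0 on the 9 entries of F.
    A = np.column_stack([
        x2[:, 0] * x1[:, 0], x2[:, 0] * x1[:, 1], x2[:, 0],
        x2[:, 1] * x1[:, 0], x2[:, 1] * x1[:, 1], x2[:, 1],
        x1[:, 0], x1[:, 1], np.ones(len(x1))])
    _, _, Vt = np.linalg.svd(A)
    F = Vt[-1].reshape(3, 3)
    # Enforce the rank-2 constraint of a fundamental matrix.
    U, S, Vt = np.linalg.svd(F)
    S[2] = 0.0
    F = U @ np.diag(S) @ Vt
    F = T2.T @ F @ T1          # undo the normalization
    return F / np.linalg.norm(F)

# --- synthetic check: two views of random 3D points (hypothetical setup) ---
rng = np.random.default_rng(0)
K = np.array([[500.0, 0, 320], [0, 500.0, 240], [0, 0, 1]])  # assumed intrinsics
th = 0.1                       # small rotation about the y-axis plus a translation
R = np.array([[np.cos(th), 0, np.sin(th)],
              [0, 1, 0],
              [-np.sin(th), 0, np.cos(th)]])
t = np.array([1.0, 0.2, 0.1])
X = rng.uniform([-2, -2, 4], [2, 2, 8], size=(20, 3))  # points in front of both cameras

def project(P, X):
    Xh = np.hstack([X, np.ones((len(X), 1))])
    x = (P @ Xh.T).T
    return x[:, :2] / x[:, 2:]

p1 = project(K @ np.hstack([np.eye(3), np.zeros((3, 1))]), X)
p2 = project(K @ np.hstack([R, t[:, None]]), X)
F = eight_point(p1, p2)

# Epipolar constraint x2^T F x1 should be ~0 for every correspondence.
h1 = np.hstack([p1, np.ones((20, 1))])
h2 = np.hstack([p2, np.ones((20, 1))])
residuals = np.abs(np.einsum('ij,jk,ik->i', h2, F, h1))
```

From here, the motion parameters the abstract refers to would be recovered via the essential matrix E = Kᵀ F K and its decomposition into a rotation and a unit translation direction; the translation's length (scale) is not observable from two views alone, which is why a separate displacement computation step is needed.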

Published in:

2010 IEEE International Conference on Robotics and Biomimetics (ROBIO)

Date of Conference:

14-18 Dec. 2010