In this paper, we propose a position tracking method for robot self-localization with monocular vision. The robot locates itself relying solely on its onboard monocular camera, so the localization result is unaffected by odometer error caused by wheel slippage. Our approach uses the Scale Invariant Feature Transform (SIFT) for feature detection and matching over consecutive frames, from which the fundamental matrix is computed using epipolar geometry. A robust outlier elimination technique and iterative computation are combined to improve the robustness and accuracy of the estimated fundamental matrix. The motion parameters, namely the rotation matrix and the displacement direction vector, are calculated from the fundamental matrix and the pre-calibrated intrinsic parameters. A recursive displacement computation algorithm then solves for the displacement length to achieve position tracking. Experiments carried out in an indoor environment demonstrate the effectiveness of our approach.
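The abstract's middle step, recovering the rotation matrix and displacement direction from the fundamental matrix and the calibrated intrinsics, can be sketched as follows. This is an illustrative NumPy sketch of the standard epipolar-geometry result, not the paper's own implementation: the essential matrix is formed as E = Kᵀ F K and factored by SVD into four candidate (R, t) pairs (t recovered only up to scale, matching the abstract's "displacement direction vector"). The intrinsic matrix K and the ground-truth motion below are made-up values for the demonstration; a real system would resolve the four-fold ambiguity with a cheirality (point-in-front-of-both-cameras) test.

```python
import numpy as np

def skew(t):
    """Cross-product matrix [t]_x so that skew(t) @ v == np.cross(t, v)."""
    return np.array([[0.0, -t[2], t[1]],
                     [t[2], 0.0, -t[0]],
                     [-t[1], t[0], 0.0]])

def pose_candidates_from_F(F, K):
    """Return the four candidate (R, t) motions encoded by a fundamental
    matrix F and intrinsic matrix K. t is a unit direction (scale is
    unobservable from F alone); the correct candidate is normally chosen
    by a cheirality test on triangulated points."""
    E = K.T @ F @ K                      # essential matrix E = K^T F K
    U, _, Vt = np.linalg.svd(E)
    # Force proper rotations (determinant +1) in both SVD factors.
    if np.linalg.det(U) < 0:
        U = -U
    if np.linalg.det(Vt) < 0:
        Vt = -Vt
    W = np.array([[0.0, -1.0, 0.0],
                  [1.0,  0.0, 0.0],
                  [0.0,  0.0, 1.0]])
    t = U[:, 2]                          # translation direction = last left singular vector
    return [(U @ W @ Vt,  t), (U @ W @ Vt,  -t),
            (U @ W.T @ Vt, t), (U @ W.T @ Vt, -t)]

# --- Synthetic check with made-up calibration and motion ---
K = np.array([[800.0, 0.0, 320.0],      # hypothetical intrinsics
              [0.0, 800.0, 240.0],
              [0.0,   0.0,   1.0]])
th = np.deg2rad(10.0)                   # 10-degree yaw between frames
R_true = np.array([[np.cos(th), 0.0, np.sin(th)],
                   [0.0,        1.0, 0.0],
                   [-np.sin(th), 0.0, np.cos(th)]])
t_true = np.array([1.0, 0.2, 0.1])
t_true /= np.linalg.norm(t_true)        # unit direction, as recovered from F

E_true = skew(t_true) @ R_true          # E = [t]_x R
Kinv = np.linalg.inv(K)
F = Kinv.T @ E_true @ Kinv              # F = K^{-T} E K^{-1}

candidates = pose_candidates_from_F(F, K)
best_err = min(np.linalg.norm(Rc - R_true) + np.linalg.norm(tc - t_true)
               for Rc, tc in candidates)
```

Because the translation is only recovered as a direction, an absolute displacement length still requires extra information; this is precisely the gap the abstract's recursive displacement computation algorithm addresses.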