
Mobile robot ego motion estimation using RANSAC-based ceiling vision

6 Author(s)
Han Wang ; Sch. of Electr. & Electron. Eng., Nanyang Technol. Univ., Singapore, Singapore ; Wei Mou ; Minh Hiep Ly ; Lau, M.W.S.

Visual odometry is a commonly used technique for recovering the motion and location of a robot. In this paper, we present a robust visual odometry estimation approach based on the ceiling view from a 3D camera (Kinect). We extract Speeded-Up Robust Features (SURF) from the monocular image frames retrieved from the camera. SURF features from two consecutive frames are matched by finding nearest neighbors with a KD-tree. The 3D positions of the SURF features are retrieved from the camera's depth map. A 3D affine transformation model between the two frames is then estimated using the Random Sample Consensus (RANSAC) method. All inliers are subsequently used to re-estimate the relative transformation between the two frames by Singular Value Decomposition (SVD). From these relative transformations, the global robot position and orientation can be calculated. Experimental results demonstrate the performance of the proposed algorithm in real environments.
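The geometric core of the pipeline described above, a RANSAC loop over matched 3D points followed by an SVD-based (Kabsch) re-estimation of the rigid transform over all inliers, can be sketched as below. This is a minimal illustrative sketch, not the authors' implementation: the function names, sample size, inlier threshold, and iteration count are all assumptions, and feature extraction/matching is assumed to have already produced corresponding 3D point sets `src` and `dst`.

```python
import numpy as np

def rigid_transform_svd(src, dst):
    """Least-squares rigid transform (R, t) mapping src -> dst via SVD (Kabsch).

    src, dst: (N, 3) arrays of corresponding 3D points.
    """
    c_src = src.mean(axis=0)
    c_dst = dst.mean(axis=0)
    # Cross-covariance of the centered point sets
    H = (src - c_src).T @ (dst - c_dst)
    U, _, Vt = np.linalg.svd(H)
    # Reflection correction so that det(R) = +1
    d = np.sign(np.linalg.det(Vt.T @ U.T))
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = c_dst - R @ c_src
    return R, t

def ransac_rigid(src, dst, n_iters=200, thresh=0.05, seed=None):
    """RANSAC over 3-point minimal samples; re-estimate from all inliers.

    thresh is the inlier residual in the same units as the points (assumed).
    """
    rng = np.random.default_rng(seed)
    best_inliers = np.zeros(len(src), dtype=bool)
    for _ in range(n_iters):
        idx = rng.choice(len(src), size=3, replace=False)
        R, t = rigid_transform_svd(src[idx], dst[idx])
        err = np.linalg.norm(src @ R.T + t - dst, axis=1)
        inliers = err < thresh
        if inliers.sum() > best_inliers.sum():
            best_inliers = inliers
    # Final SVD re-estimation using every inlier, as the paper describes
    R, t = rigid_transform_svd(src[best_inliers], dst[best_inliers])
    return R, t, best_inliers
```

Chaining the per-frame-pair (R, t) estimates by composition then yields the global robot pose over time.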

Published in:

Control and Decision Conference (CCDC), 2012 24th Chinese

Date of Conference:

23-25 May 2012