Orthogonal wall correction for visual motion estimation

2 Author(s)
Jörg Stückler and Sven Behnke, Computer Science Institute, Albert-Ludwigs-University of Freiburg, 79110 Freiburg, Germany

A good motion model is a prerequisite for many approaches to simultaneous localization and mapping. Without an absolute reference, however, it is difficult to prevent drift when estimating motion. To prevent orientation drift, our approach exploits a typical feature of indoor environments: straight walls that are parallel or orthogonal to each other. The idea is to detect walls in monocular depth measurements and to correct the odometry obtained from matching successive images and from inertial measurements, such that the observed walls align with the main orientation estimated from the map being built. The experimental results indicate that applying the proposed orthogonal wall correction prevents orientation drift and greatly reduces orientation uncertainty. This can make the difference between reliable mapping and failure.
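The core of such a correction step can be illustrated with a small sketch. The snippet below is not the authors' implementation; it is a minimal illustration of the idea stated in the abstract, assuming a planar robot with a single yaw angle, an observed wall direction expressed in the robot frame, and a known dominant wall direction of the map. The function names and parameters are hypothetical.

```python
import math

def wrap_angle(a):
    """Wrap an angle to (-pi, pi]."""
    return math.atan2(math.sin(a), math.cos(a))

def orthogonal_wall_correction(yaw_odom, wall_angle_robot, main_orientation):
    """Correct a drifting yaw estimate using an observed wall.

    yaw_odom          -- robot heading from visual/inertial odometry (rad, world frame)
    wall_angle_robot  -- observed wall direction in the robot frame (rad)
    main_orientation  -- dominant wall direction of the map built so far (rad)

    Walls in typical indoor environments are parallel or orthogonal to the
    main orientation, so the observed wall is snapped to the nearest multiple
    of 90 degrees relative to it; the residual is treated as orientation drift.
    """
    # Wall direction in the world frame under the current (drifting) yaw estimate.
    wall_world = wrap_angle(yaw_odom + wall_angle_robot)
    # Nearest orthogonal direction relative to the map's main orientation.
    k = round((wall_world - main_orientation) / (math.pi / 2))
    expected = main_orientation + k * (math.pi / 2)
    # The residual between observed and expected wall direction is the drift.
    drift = wrap_angle(wall_world - expected)
    return wrap_angle(yaw_odom - drift)
```

For example, if the true heading is 0.30 rad but odometry reports 0.35 rad, a wall that is axis-aligned in the world (main orientation 0) appears at -0.30 rad in the robot frame; the correction recovers the true heading of 0.30 rad.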

Published in:

2008 IEEE International Conference on Robotics and Automation (ICRA 2008)

Date of Conference:

19-23 May 2008