Visual Protractor Based Localization Algorithm for Mobile Robot

Authors:

Wei Chen; Xuening Wang; Tao Wu; Xin Xu (College of Mechatronics & Automation, National University of Defense Technology, Changsha)

Data fusion is an effective tool for improving the localization precision of a mobile robot. This paper presents a new algorithm that improves the precision of robot localization by fusing data from three kinds of sensors: encoders, a gyroscope, and a camera. In the data fusion process, the camera is used as a visual protractor, which is a simple and robust way to process vision information yet plays an important role in reducing localization errors in high slip-rate cases. A mobile robot localization experiment is designed to test the efficiency of the algorithm. In this experiment, the localization precision of the robot improves by almost 400% when the visual protractor is introduced.
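The abstract describes fusing encoder, gyroscope, and camera data, with the camera acting as a visual protractor that supplies an absolute angular reference to limit heading drift under wheel slip. As a rough illustration of that idea only (the paper's actual fusion method is not given here), the sketch below dead-reckons pose from encoder distance and gyroscope rate and blends in a camera-derived bearing with a simple complementary filter; the class, method, and gain names are hypothetical.

    # Hypothetical sketch: fusing encoder odometry, a gyroscope rate, and a
    # camera-derived absolute bearing ("visual protractor") into a pose estimate.
    # The complementary-filter correction is an assumption made for illustration.

    import math

    class FusedLocalizer:
        def __init__(self, x=0.0, y=0.0, heading=0.0, vision_gain=0.3):
            self.x = x
            self.y = y
            self.heading = heading          # radians
            self.vision_gain = vision_gain  # weight given to the camera bearing

        def predict(self, encoder_distance, gyro_rate, dt):
            """Dead-reckoning step: encoders give distance, gyro gives turn rate."""
            self.heading += gyro_rate * dt
            self.x += encoder_distance * math.cos(self.heading)
            self.y += encoder_distance * math.sin(self.heading)

        def correct_with_vision(self, camera_bearing):
            """Blend the camera's absolute bearing into the heading estimate.
            A drift-free angular reference limits error growth when wheels slip."""
            error = math.atan2(math.sin(camera_bearing - self.heading),
                               math.cos(camera_bearing - self.heading))
            self.heading += self.vision_gain * error

    # Example run: encoder and gyro updates every 0.1 s, camera updates less often.
    loc = FusedLocalizer()
    for step in range(100):
        loc.predict(encoder_distance=0.05, gyro_rate=0.01, dt=0.1)
        if step % 10 == 0:
            loc.correct_with_vision(camera_bearing=0.05)
    print(f"pose: x={loc.x:.2f} m, y={loc.y:.2f} m, heading={loc.heading:.3f} rad")

Because the camera bearing is an absolute measurement, its correction bounds the heading error even when the encoders over- or under-report motion, which is consistent with the abstract's claim that the visual protractor helps most in high slip-rate cases.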

Published in:

2006 Sixth International Conference on Intelligent Systems Design and Applications (ISDA '06), Volume 3

Date of Conference:

16-18 Oct. 2006