Self-localization of omni-directional mobile robot using silicon retina based optical flow sensor

3 Author(s)
Sanada, A. ; Dept. of Brain Sci. & Eng., Inst. of Technol., Fukuoka, Japan ; Ishii, K. ; Yagi, T.

The omni-directional mobile robot has the advantage that its three degrees of freedom of motion in the 2D plane can be set independently, so it can move in an arbitrary direction while keeping the same heading. For self-localization with onboard sensors, dead reckoning based on measured wheel velocities (motor encoder data) is often used, as with car-like robots. However, omni-directional mobile robots slip easily because of the many free rollers on their omni-wheels, and dead reckoning fails even if only one wheel loses contact with the ground. A method of measuring odometry that is unaffected by wheel slip is therefore needed to obtain sufficiently accurate self-localization for omni-directional mobile robots. We describe a method to obtain the robot's ego-motion from camera images and optical flow calculation; that is, the camera is used as a velocity sensor. In this paper, a silicon retina vision camera developed by Yagi et al., which has a wide dynamic range under various lighting conditions, is introduced as a mobile robot sensor, and an FPGA optical flow circuit for the silicon retina is developed to measure the ego-motion of the mobile robot. The developed optical flow calculation system is installed on a small omni-directional mobile robot, and evaluation experiments using the robot's ego-motion are carried out. In the experiments, the accuracy of self-localization by dead reckoning and by optical flow is evaluated by comparison with motion capture. The results show that the self-localization obtained by optical flow agrees with the motion capture data.
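The self-localization scheme described in the abstract boils down to integrating a body-frame velocity estimate (whether from wheel encoders or from optical flow acting as a velocity sensor) into a world-frame pose. The following is a minimal sketch of that integration step; the function name, the fixed time step, and the constant-velocity loop are illustrative assumptions, not the paper's implementation:

```python
import math

def integrate_pose(pose, vx, vy, omega, dt):
    """Integrate a body-frame velocity measurement (e.g. from the
    optical flow sensor) into a world-frame pose (x, y, theta).
    vx, vy are body-frame linear velocities; omega is the yaw rate."""
    x, y, theta = pose
    # Rotate the body-frame velocity into the world frame, then step forward.
    x += (vx * math.cos(theta) - vy * math.sin(theta)) * dt
    y += (vx * math.sin(theta) + vy * math.cos(theta)) * dt
    theta += omega * dt
    return (x, y, theta)

# An omni-directional robot can translate sideways while holding its heading:
pose = (0.0, 0.0, 0.0)
for _ in range(100):
    pose = integrate_pose(pose, vx=0.0, vy=0.5, omega=0.0, dt=0.01)
# After 1 s at 0.5 m/s sideways, the pose is roughly (0, 0.5, 0).
```

The same integration applies to encoder-based dead reckoning; the paper's point is that feeding it slip-free velocities from optical flow, rather than wheel speeds, keeps the accumulated pose close to ground truth.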

Published in:

World Automation Congress (WAC), 2010

Date of Conference:

19-23 Sept. 2010