The omni-directional mobile robot has the advantage that its three degrees of freedom of motion in the 2D plane can be set independently, so it can move in an arbitrary direction while keeping the same heading. For self-localization with onboard sensors, dead reckoning based on measured wheel velocities (motor encoder data) is often used, as with car-like robots. However, omni-directional mobile robots slip easily because omni-wheels carry many free rollers, so dead reckoning fails even if only one wheel loses contact with the ground. To obtain sufficiently accurate self-location for omni-directional mobile robots, an odometry method that is unaffected by wheel slip must be introduced. We describe a method to obtain robot ego-motion from camera images and optical flow calculation; that is, the camera is used as a velocity sensor. In this paper, a silicon retina vision camera developed by Yagi et al., which has a good dynamic range under various lighting conditions, is introduced as a mobile robot sensor, and an FPGA optical flow circuit for the silicon retina is developed to measure the ego-motion of the mobile robot. The developed optical flow calculation system is installed on a small omni-directional mobile robot, and evaluation experiments using the robot's ego-motion are carried out. In the experiments, the accuracy of self-location by dead reckoning and by optical flow is evaluated by comparison with motion capture. The results show that the self-location obtained by optical flow agrees with the motion capture data.
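The encoder-based dead reckoning that the abstract contrasts against can be sketched as follows. This is a minimal illustration, not the paper's implementation: it assumes a hypothetical three-wheel omni-directional platform with wheels at 120-degree spacing at an assumed distance `R` from the center, and it shows why any wheel slip corrupts the estimate (the integration trusts the wheel surface speeds unconditionally).

```python
import math

# Assumed geometry for a three-wheel omni-directional robot (not from the paper):
# wheels at 120-degree spacing, each at distance R from the robot center.
R = 0.1  # wheel-to-center distance [m] (assumed)
ANGLES = [math.radians(a) for a in (0.0, 120.0, 240.0)]

def body_velocity(wheel_speeds):
    """Invert the omni-wheel kinematics: wheel surface speeds -> (vx, vy, omega).

    Forward model per wheel i: v_i = -sin(a_i)*vx + cos(a_i)*vy + R*omega.
    For the symmetric 120-degree layout the 3x3 system inverts in closed form.
    """
    vx = (2.0 / 3.0) * sum(-math.sin(a) * v for a, v in zip(ANGLES, wheel_speeds))
    vy = (2.0 / 3.0) * sum(math.cos(a) * v for a, v in zip(ANGLES, wheel_speeds))
    omega = sum(wheel_speeds) / (3.0 * R)
    return vx, vy, omega

def dead_reckon(pose, wheel_speeds, dt):
    """One dead-reckoning step: integrate encoder-derived body velocity into
    the world-frame pose (x, y, theta). A slipping wheel reports a surface
    speed the robot never achieved, and the error accumulates here forever."""
    x, y, th = pose
    vx, vy, omega = body_velocity(wheel_speeds)
    # Rotate the body-frame velocity into the world frame before integrating.
    x += (vx * math.cos(th) - vy * math.sin(th)) * dt
    y += (vx * math.sin(th) + vy * math.cos(th)) * dt
    th += omega * dt
    return (x, y, th)
```

For example, equal speeds on all three wheels yield a pure rotation, while the pattern `[v, -v/2, -v/2]` yields a pure sideways translation with no heading change.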
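The camera-as-velocity-sensor idea can likewise be sketched in a few lines. The following is an assumption-laden illustration rather than the paper's FPGA pipeline: it supposes a ground-facing camera at a known height with a pinhole model, and averages per-pixel flow vectors (pixels per frame, however they were computed) into one translational velocity estimate; the camera height, focal length, and sign convention are all assumptions.

```python
def ego_motion_from_flow(flow_vectors, height, focal_px, dt):
    """Convert per-pixel optical flow vectors (du, dv in pixels/frame) into a
    body-frame translational velocity estimate (vx, vy) in m/s.

    Assumes a downward-facing camera at `height` metres above a flat floor
    with focal length `focal_px` pixels, and pure translation (no rotation).
    """
    n = len(flow_vectors)
    du = sum(f[0] for f in flow_vectors) / n
    dv = sum(f[1] for f in flow_vectors) / n
    # Pinhole model: one pixel of image motion corresponds to height/focal_px
    # metres on the floor. The scene flows opposite to the camera's motion,
    # hence the sign flip.
    scale = height / (focal_px * dt)
    return -du * scale, -dv * scale
```

Because this estimate comes from what the floor actually did in the image, it is insensitive to wheel slip, which is the property the paper exploits.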
World Automation Congress (WAC), 2010
Date of Conference: 19-23 Sept. 2010