The goal of the Autonomous City Explorer (ACE) is to navigate autonomously, efficiently, and safely in an unpredictable and unstructured urban environment. Accurate localization is a precondition for this aim. Given the characteristics of our navigation environment, a visual odometry system is proposed to estimate the current position and orientation of the ACE platform. Existing optical flow algorithms are experimentally evaluated and compared, and the pyramidal Lucas-Kanade method is selected for its high-speed performance. From the optical flow in 2D images, the camera ego-motion is estimated using the image Jacobian matrix and the least squares method. A kinematic model is set up to map the camera ego-motion to the robot motion. To eliminate systematic errors, a novel system calibration approach is proposed. Finally, the odometry system is evaluated in experiments.
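To make the selected flow method concrete, the following is a minimal, illustrative sketch of a single Lucas-Kanade step in NumPy: it solves the normal equations for the translational flow of one image window. This is not the paper's implementation; the pyramidal variant the abstract refers to runs this same step coarse-to-fine over an image pyramid, warping the image between levels. All function names and the window size are assumptions for illustration.

```python
import numpy as np

def lucas_kanade_step(I0, I1, x, y, win=7):
    """One Lucas-Kanade step: least-squares solution of the
    brightness-constancy equations Ix*dx + Iy*dy = -It over the
    window centred on pixel (x, y). Returns the flow (dx, dy)."""
    h = win // 2
    p0 = I0[y - h:y + h + 1, x - h:x + h + 1].astype(float)
    p1 = I1[y - h:y + h + 1, x - h:x + h + 1].astype(float)
    Iy, Ix = np.gradient(p0)          # spatial gradients of the window
    It = p1 - p0                      # temporal difference
    A = np.stack([Ix.ravel(), Iy.ravel()], axis=1)
    b = -It.ravel()
    d, *_ = np.linalg.lstsq(A, b, rcond=None)
    return d                          # (dx, dy) flow estimate

# Sanity check: a horizontal intensity ramp shifted right by one pixel
# should yield a flow of roughly (1, 0) away from the image border.
I0 = np.tile(np.arange(30.0), (30, 1))
I1 = np.roll(I0, 1, axis=1)
d = lucas_kanade_step(I0, I1, x=15, y=15)
print(d)  # approximately [1. 0.]
```

A practical system would use an optimized implementation (e.g. OpenCV's pyramidal tracker) rather than this per-window solve, but the underlying least-squares structure is the same.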
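The ego-motion step described above can also be sketched. Under a standard interaction-matrix (image Jacobian) model for normalized image points, each point's flow is linear in the camera twist, and stacking the per-point Jacobians gives an overdetermined system solvable by least squares. This is a generic sketch under assumed sign conventions, not the paper's specific formulation; point depths are assumed known here for illustration.

```python
import numpy as np

def interaction_matrix(x, y, Z):
    """Image Jacobian for a normalized image point (x, y) at depth Z,
    mapping the camera twist (vx, vy, vz, wx, wy, wz) to image flow."""
    return np.array([
        [-1.0 / Z, 0.0, x / Z, x * y, -(1.0 + x * x), y],
        [0.0, -1.0 / Z, y / Z, 1.0 + y * y, -x * y, -x],
    ])

def estimate_ego_motion(points, depths, flows):
    """Stack the per-point Jacobians and solve L v = u for the
    6-DOF camera twist v by least squares."""
    L = np.vstack([interaction_matrix(x, y, Z)
                   for (x, y), Z in zip(points, depths)])
    u = np.hstack(flows)
    v, *_ = np.linalg.lstsq(L, u, rcond=None)
    return v

# Synthetic check: generate noise-free flow from a known twist,
# then recover that twist from the stacked linear system.
rng = np.random.default_rng(0)
points = rng.uniform(-0.5, 0.5, size=(20, 2))
depths = rng.uniform(2.0, 5.0, size=20)
v_true = np.array([0.1, 0.0, 0.3, 0.0, 0.05, 0.02])
flows = [interaction_matrix(x, y, Z) @ v_true
         for (x, y), Z in zip(points, depths)]
v_est = estimate_ego_motion(points, depths, flows)
print(np.allclose(v_est, v_true))
```

In practice the flow vectors come from the tracker and are noisy, so a robust variant (e.g. outlier rejection before the least-squares solve) is typically layered on top.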