Current approaches to determining the orientation and maintaining the balance of mobile robots typically rely on gyro and tilt sensor data. This paper presents an image-based sensor fusion approach using sensed data from a MEMS gyro and a digital image processing system. The approach relies on the statistical property of man-made or cultural environments to exhibit predominantly more horizontal and vertical edges than oblique edges. The gyro data and statistical image data are Kalman filtered to estimate the roll angle. The system was tested both indoors and outdoors at the University of Arizona campus, and it demonstrated continuous roll angle drift correction without prior knowledge of or training on the environment. The algorithm was then implemented in a biped walking robot to demonstrate a real-time, end-to-end proof of concept.
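The fusion step described in the abstract can be illustrated with a minimal one-dimensional Kalman filter that integrates the gyro roll rate in the predict step and corrects the accumulated drift whenever an image-derived roll measurement is available. This is only a sketch of the general technique, not the paper's implementation: the function name, sample period, and noise variances `q` and `r` are illustrative assumptions.

```python
def kalman_roll(gyro_rates, image_rolls, dt=0.01, q=1e-4, r=1e-2):
    """Sketch: 1-D Kalman filter fusing gyro rate with an image-derived
    roll measurement (all parameters are assumed, not from the paper).

    gyro_rates  : roll-rate samples (rad/s) from the MEMS gyro
    image_rolls : roll-angle measurements (rad) from edge statistics,
                  or None at frames with no usable measurement
    q, r        : assumed process / measurement noise variances
    """
    roll, p = 0.0, 1.0            # state estimate and its variance
    estimates = []
    for w, z in zip(gyro_rates, image_rolls):
        # Predict: integrate the gyro rate; uncertainty grows by q
        roll += w * dt
        p += q
        # Update: the vision measurement bounds the gyro drift
        if z is not None:
            k = p / (p + r)       # Kalman gain
            roll += k * (z - roll)
            p *= (1.0 - k)
        estimates.append(roll)
    return estimates
```

With a constant gyro bias and a true roll of zero, pure integration would drift without bound, while the image measurements keep the estimate close to the true angle, mirroring the continuous drift correction reported in the paper.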