The position and orientation estimation problem for mobile robots is approached by fusing measurements from inertial sensors, wheel encoders, and a camera. The sensor fusion approach is based on the standard extended Kalman filter, modified to handle camera measurements that arrive with an a priori unknown time delay. A real-time implementation is demonstrated on a four-wheeled omni-directional mobile robot, using a dynamic model with 11 states. The algorithm is analyzed and validated in both simulations and experiments.
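The abstract does not spell out how the delayed camera measurements are fused, so the following is only an illustrative sketch of one common approach: buffer past filter states, roll the filter back to the time the camera image was captured, apply the update there, and re-propagate to the present. For brevity it uses a linear Kalman filter with a hypothetical one-dimensional model rather than the paper's 11-state EKF, and it does not replay intermediate encoder/inertial updates during re-propagation.

```python
from collections import deque
import numpy as np

class DelayedMeasurementKF:
    """Linear Kalman filter (stand-in for the paper's EKF) that buffers
    past states so a delayed measurement can be fused at the time it
    was actually taken, then re-propagated to the present."""

    def __init__(self, x0, P0, F, Q, buffer_len=50):
        self.x, self.P = x0, P0
        self.F, self.Q = F, Q
        self.history = deque(maxlen=buffer_len)  # (step, x, P) snapshots

    def predict(self, step):
        # Snapshot the state at `step` before propagating to step + 1.
        self.history.append((step, self.x.copy(), self.P.copy()))
        self.x = self.F @ self.x
        self.P = self.F @ self.P @ self.F.T + self.Q

    def update(self, z, H, R):
        # Standard Kalman measurement update.
        S = H @ self.P @ H.T + R
        K = self.P @ H.T @ np.linalg.inv(S)
        self.x = self.x + K @ (z - H @ self.x)
        self.P = (np.eye(len(self.x)) - K @ H) @ self.P

    def update_delayed(self, z, H, R, meas_step, current_step):
        # Discard snapshots newer than the measurement and restore the
        # state the filter had when the camera image was captured.
        while self.history and self.history[-1][0] > meas_step:
            self.history.pop()
        if not self.history or self.history[-1][0] != meas_step:
            return False  # older than the buffer: drop the measurement
        _, self.x, self.P = self.history.pop()
        self.update(z, H, R)
        # Re-propagate to the present (intermediate updates from other
        # sensors are not replayed in this simplified sketch).
        for step in range(meas_step, current_step):
            self.predict(step)
        return True
```

A usage example: predict three steps, then fuse a camera measurement that was captured at step 1 but only arrives at step 3; the delayed update still shrinks the covariance.

```python
kf = DelayedMeasurementKF(np.array([0.0]), np.array([[10.0]]),
                          np.eye(1), 0.01 * np.eye(1))
for step in range(3):
    kf.predict(step)
kf.update_delayed(np.array([1.0]), np.eye(1), np.eye(1),
                  meas_step=1, current_step=3)
```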