This paper presents a system developed in the SmartTracking project that simultaneously estimates the egomotion of a mobile platform and the structure of the environment in which the platform moves. Both estimates are required in applications such as robot navigation and augmented reality (AR), where virtual information must be overlaid correctly. Egomotion estimation is achieved by integrating visual and inertial sensor data; structure estimation is based on the detection of corner features in the environment. Starting from a single known position, the system can move into an unknown environment. To enable fast robotics applications, the system uses specially developed CMOS cameras that operate at a rate of 2000 image windows per second. The vision and inertial data are fused with an extended Kalman filter designed to handle asynchronous input from the two sensors, which typically operate at different and possibly varying rates. Additionally, a bank of filters estimates the quality of candidate structure points and incorporates them into the structure estimation process. The system is demonstrated on a set-up with known ground truth, so that the motion from a known reference template into a new, unknown environment can be shown.
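As a rough illustration of how such asynchronous fusion can work, the sketch below processes vision and inertial measurements strictly in timestamp order, propagating the state forward by the variable interval between samples before each update. It is a minimal sketch under strong simplifying assumptions: a 1-D constant-velocity state, a velocity pseudo-measurement standing in for the inertial sensor, and invented rates and noise values. The class name `AsyncEKF` and all numbers are illustrative, not the paper's formulation, which estimates full 6-DoF pose plus environment structure.

```python
# Minimal sketch of an asynchronous extended Kalman filter update loop.
# Assumption: a 1-D constant-velocity state [position, velocity]; the
# paper's filter is higher-dimensional. All noise values are made up.
import numpy as np

class AsyncEKF:
    def __init__(self):
        self.x = np.zeros(2)   # state: [position, velocity]
        self.P = np.eye(2)     # state covariance
        self.t = 0.0           # filter time of last processed measurement
        self.q = 0.1           # process-noise intensity

    def predict(self, t):
        """Propagate the state to time t (handles variable sensor rates)."""
        dt = t - self.t
        F = np.array([[1.0, dt], [0.0, 1.0]])       # constant-velocity model
        Q = self.q * np.array([[dt**3 / 3, dt**2 / 2],
                               [dt**2 / 2, dt]])    # discretized process noise
        self.x = F @ self.x
        self.P = F @ self.P @ F.T + Q
        self.t = t

    def update(self, z, H, R):
        """Standard Kalman measurement update for one sensor."""
        S = H @ self.P @ H.T + R                    # innovation covariance
        K = self.P @ H.T @ np.linalg.inv(S)         # Kalman gain
        self.x = self.x + K @ (z - H @ self.x)
        self.P = (np.eye(2) - K @ H) @ self.P

ekf = AsyncEKF()
# Asynchronous inputs: (timestamp, sensor, measurement). Vision observes
# position; the inertial sensor is reduced here to a velocity
# pseudo-measurement for simplicity.
stream = [(0.001, "inertial", 0.02), (0.004, "vision", 0.0001),
          (0.005, "inertial", 0.05), (0.008, "vision", 0.0004)]
H = {"vision": np.array([[1.0, 0.0]]), "inertial": np.array([[0.0, 1.0]])}
R = {"vision": np.array([[1e-6]]), "inertial": np.array([[1e-4]])}

for t, sensor, z in sorted(stream):   # process strictly in time order
    ekf.predict(t)
    ekf.update(np.array([z]), H[sensor], R[sensor])
    print(f"t={t:.3f}s {sensor:8s} state={ekf.x}")
```

The design point carried over from the abstract is that neither sensor drives the filter clock: the prediction step absorbs whatever interval has elapsed since the last measurement, which is what allows the filter to accept inputs arriving at different and possibly varying rates.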