The automotive and video game industries have driven the cost of Micro-Electro-Mechanical System (MEMS) accelerometers and gyroscopes down to just a few dollars per sensor. Likewise, the personal computer and cell phone industries have driven the cost of relatively high-resolution camera chips down to comparably low levels. These cost reductions make a navigation system that combines vision and inertial sensors an inexpensive and effective way to navigate without the aid of GPS signals.

Vision and inertial sensors are well suited to work together because their error characteristics are complementary. MEMS accelerometers and gyroscopes can track high-speed motions, but they suffer from long-term drift that makes them impractical as standalone navigation sensors. Using Bayesian estimation methods, however, vision information from a stereo camera pair can correct these drift errors. The two sensor types in tandem can therefore produce an accurate navigation system that operates indoors or in other areas where GPS signals or other navigation aids are unavailable. In this paper, we present two data analysis methods that can be used to calibrate and characterize the noise produced by MEMS-based IMUs.
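The idea that Bayesian estimation lets slow vision fixes bound the drift of a fast inertial sensor can be sketched with a toy one-dimensional Kalman filter. Everything below is illustrative: the biased IMU velocity, the noise variances, and the 5 Hz vision rate are assumptions for the sketch, not values from the paper.

```python
import random

def predict(x, P, u, dt, q):
    """Propagate position with an IMU-derived velocity u; q is process noise density."""
    return x + u * dt, P + q * dt

def update(x, P, z, r):
    """Correct with a vision position measurement z; r is measurement noise variance."""
    K = P / (P + r)            # Kalman gain
    return x + K * (z - x), (1.0 - K) * P

random.seed(0)
dt = 0.01                      # 100 Hz IMU
x_est, P = 0.0, 1e-4
truth = 0.0
for k in range(1, 101):
    truth += 1.0 * dt                                   # true velocity: 1.0 m/s
    x_est, P = predict(x_est, P, 1.05, dt, q=1e-5)      # IMU velocity biased by 0.05 m/s
    if k % 20 == 0:                                     # vision fix every 0.2 s (5 Hz)
        z = truth + random.gauss(0.0, 0.01)
        x_est, P = update(x_est, P, z, r=1e-4)

# Uncorrected dead reckoning would be off by 0.05 m after 1 s;
# the vision updates keep the error well below that.
print(f"final error: {abs(x_est - truth):.4f} m")
```

The same structure scales to the full problem: the prediction step integrates 3-axis accelerometer and gyroscope readings, and the update step fuses stereo-camera pose measurements, with the gain weighting each source by its estimated uncertainty.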