Cameras are often a good choice as the primary outward-looking sensor for mobile robots, and a wide field of view is usually desirable for responsive and accurate navigation, SLAM and relocalisation. While this can potentially be provided by a single omnidirectional camera, it can also be flexibly achieved by multiple cameras with standard optics mounted around the robot. However, such setups are difficult to calibrate. Here we present a general method for fully automatic extrinsic auto-calibration of a fixed multi-camera rig, with no requirement for calibration patterns or other infrastructure, which works even when the cameras have completely non-overlapping views. The robot is placed in a natural environment, makes a set of programmed movements including a full horizontal rotation, and captures a synchronised image sequence from each camera. These sequences are processed individually with a monocular visual SLAM algorithm. The resulting maps are matched and fused robustly based on corresponding invariant features, and all estimates are then refined in a full joint bundle adjustment in which the relative poses of the cameras are constrained to be fixed. We present results showing accurate performance of the method for various two- and four-camera configurations.
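To illustrate the map-fusion step, the following is a minimal sketch (not the paper's implementation) of aligning two monocular SLAM maps from corresponding 3D points. Because each monocular map has an arbitrary scale, a similarity transform (scale, rotation, translation) must be estimated; the closed-form Umeyama method shown here is one standard choice, and in practice it would be wrapped in a robust scheme such as RANSAC over the invariant-feature correspondences. All function and variable names are illustrative.

```python
import numpy as np

def umeyama_alignment(src, dst):
    """Estimate a similarity transform (s, R, t) with dst ~= s * R @ src + t.

    src, dst: (N, 3) arrays of corresponding 3D map points from two
    monocular SLAM maps. Scale s is estimated because each monocular
    map is only defined up to scale.
    """
    n = len(src)
    mu_s, mu_d = src.mean(axis=0), dst.mean(axis=0)
    src_c, dst_c = src - mu_s, dst - mu_d

    # Cross-covariance of the centred point sets.
    cov = dst_c.T @ src_c / n
    U, D, Vt = np.linalg.svd(cov)

    # Reflection correction keeps R a proper rotation (det R = +1).
    S = np.eye(3)
    if np.linalg.det(U) * np.linalg.det(Vt) < 0:
        S[2, 2] = -1.0

    R = U @ S @ Vt
    var_src = (src_c ** 2).sum() / n
    s = np.trace(np.diag(D) @ S) / var_src
    t = mu_d - s * R @ mu_s
    return s, R, t
```

After this rigid alignment, the fused map and all camera poses would be refined jointly, with the inter-camera transforms held fixed, in the full bundle adjustment described above.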