This work presents a method for matching partially overlapping image pairs in which the object of interest is in motion, even when the motion is discontinuous and the environment unstructured. In a typical outdoor multicamera surveillance system, the same object as seen by separate cameras may appear very different, owing to the variable influence of factors such as lighting conditions and camera angles. Static features such as object color, shape, and contours therefore cannot be relied upon for image matching. In this paper a different method is proposed for matching partially overlapping images captured by such cameras. The matching is achieved by computing co-motion statistics, followed by detection and rejection of points outside the overlap area and a nonlinear optimization step. The robust algorithm we describe finds point correspondences between two images without searching for any structures and without the need to track continuous motion. Trials using statistical motion-based image cross-registration, a robust rejection algorithm, and automatic 3D image transformation and camera calibration on real-life outdoor images have demonstrated the feasibility of this approach.
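The pipeline sketched in the abstract — accumulate co-motion statistics between the two views, extract candidate point correspondences, then reject outliers lying outside the overlap — can be illustrated on toy data. The sketch below is an assumption-laden simplification, not the paper's implementation: the grid size, the evidence threshold, and the ground-truth mapping (a pure pixel shift, where the paper estimates a full projective transform by nonlinear optimization) are all illustrative choices, and the median-displacement filter stands in for the paper's rejection algorithm.

```python
import numpy as np

rng = np.random.default_rng(0)
H = W = 8  # a small synthetic grid stands in for the two camera images

# Hypothetical ground-truth mapping between the views: a fixed 2x3-pixel
# shift here; the paper instead recovers a projective transform.
def view_b_of(p):
    return (p[0] + 2, p[1] + 3)

# Accumulate co-motion statistics: for every pixel of view A, a map counting
# how often each pixel of view B exhibited motion at the same instant.
co_motion = np.zeros((H, W, H, W))
for _ in range(400):
    pa = (int(rng.integers(0, H - 3)), int(rng.integers(0, W - 3)))
    motion_a = np.zeros((H, W)); motion_a[pa] = 1.0
    motion_b = np.zeros((H, W)); motion_b[view_b_of(pa)] = 1.0
    # occasional spurious motion, e.g. outside the overlap area (rejected later)
    if rng.random() < 0.2:
        motion_b[rng.integers(0, H), rng.integers(0, W)] += 1.0
    co_motion += motion_a[:, :, None, None] * motion_b[None, None, :, :]

# Candidate correspondences: for each A-pixel with enough motion evidence,
# the argmax of its co-motion map in B (threshold of 3 is illustrative).
pairs = [((i, j), np.unravel_index(np.argmax(co_motion[i, j]), (H, W)))
         for i in range(H) for j in range(W)
         if co_motion[i, j].max() >= 3]

# Robust rejection step (a stand-in for the paper's algorithm): keep only
# pairs whose displacement agrees with the median displacement.
disp = np.array([(b[0] - a[0], b[1] - a[1]) for a, b in pairs])
med = np.median(disp, axis=0)
inliers = [p for p, d in zip(pairs, disp) if np.all(np.abs(d - med) <= 1)]
print(len(inliers), med)  # the median displacement recovers the (2, 3) shift
```

Note that no individual object is tracked across frames: correspondences emerge purely from the statistics of simultaneous motion, which is why the method tolerates discontinuous motion and appearance differences between views.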