We present an algorithm for non-overlapping camera network localization using trajectory estimation. Here, localization refers to the extrinsic calibration of the network, i.e., the recovery of the relative position and orientation of each camera on a common ground-plane coordinate system. To this end, Kalman filtering is first used to model the observed trajectories in each camera's field of view. This information is then used to estimate the missing trajectory information in the unobserved regions by integrating the results of forward and backward linear regression from adjacent cameras. These estimated trajectories are then filtered and used to recover the relative position and orientation of the cameras by analyzing the estimated and observed exit and entry points of an object in each camera's field of view. We fix one camera as a reference and find the final configuration of the network by adjusting the remaining cameras with respect to this reference. We evaluate the performance of the algorithm on both simulated and real data and compare the results with state-of-the-art approaches.
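To make the core idea concrete, the following is a minimal sketch of the regression-based step on synthetic data: an object's trajectory observed in camera A is extrapolated forward by linear regression to predict its entry into camera B, and the relative orientation and position of B are then recovered by aligning the trajectory heading and entry point observed in B's own frame. All numbers (trajectory, camera pose, timestamps) are hypothetical, and the Kalman filtering, backward regression, and multi-camera adjustment stages of the full algorithm are omitted.

```python
import math

def fit_line(ts, xs):
    """Least-squares fit x(t) = a + b*t; returns (a, b)."""
    n = len(ts)
    tm, xm = sum(ts) / n, sum(xs) / n
    b = sum((t - tm) * (x - xm) for t, x in zip(ts, xs)) \
        / sum((t - tm) ** 2 for t in ts)
    return xm - b * tm, b

def rot(theta, p):
    """Rotate 2-D point p by angle theta."""
    c, s = math.cos(theta), math.sin(theta)
    return (c * p[0] - s * p[1], s * p[0] + c * p[1])

# --- synthetic scene (hypothetical values, not from the paper) ---
v_true = (1.0, 0.5)                         # ground-plane velocity
traj = lambda t: (v_true[0] * t, v_true[1] * t)

theta_true, c_true = 0.4, (12.0, 3.0)       # camera B's pose in A's frame
to_B = lambda p: rot(-theta_true, (p[0] - c_true[0], p[1] - c_true[1]))

ts_A = [0, 1, 2, 3, 4, 5]                   # object visible in camera A ...
ts_B = [10, 11, 12, 13, 14, 15]             # ... then, after a gap, in B
obs_A = [traj(t) for t in ts_A]             # observations in A's frame
obs_B = [to_B(traj(t)) for t in ts_B]       # observations in B's frame

# Forward regression in A: predict the entry point at B's first timestamp.
ax, bx = fit_line(ts_A, [p[0] for p in obs_A])
ay, by = fit_line(ts_A, [p[1] for p in obs_A])
entry_pred = (ax + bx * ts_B[0], ay + by * ts_B[0])   # in A's frame

# Trajectory heading as seen in B's own frame.
_, ux = fit_line(ts_B, [p[0] for p in obs_B])
_, uy = fit_line(ts_B, [p[1] for p in obs_B])

# Rotation that maps B's heading onto A's heading.
theta_est = math.atan2(ux * by - uy * bx, ux * bx + uy * by)

# Translation: align B's observed entry point with the predicted one.
pB0 = rot(theta_est, obs_B[0])
c_est = (entry_pred[0] - pB0[0], entry_pred[1] - pB0[1])

print(theta_est, c_est)
```

In a network of more than two cameras, this pairwise estimate would be computed for each pair of adjacent cameras and the poses chained back to the fixed reference camera, as the abstract describes.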