
Multi-camera tracking by joint calibration, association and fusion

2 Author(s):
Siyue Chen and H. Leung, Dept. of Electr. & Comput. Eng., Univ. of Calgary, Calgary, AB, Canada

To perform surveillance using multiple cameras, three components are essential: camera calibration, measurement-to-object association, and fusion of measurements from multiple cameras. While these three issues are usually addressed separately, they actually affect one another. For example, calibration requires correctly associated objects, while measurements corrupted by calibration errors lead to wrong associations. In this paper, we present a novel joint calibration, association and fusion approach for multi-camera tracking. More specifically, the expectation-maximization (EM) algorithm is combined with the extended Kalman filter (EKF) to simultaneously estimate object states, calibration parameters and association parameters. Real video data collected from two cameras are used to evaluate the tracking performance of the proposed method. Compared to conventional methods, which perform calibration, association and fusion separately, the proposed method is shown to significantly improve both the robustness and the accuracy of multi-object tracking.
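The abstract does not give the algorithmic details, but the general idea of alternating between filtering and calibration-parameter re-estimation can be illustrated with a toy sketch. The example below is a hypothetical, heavily simplified version: a 1-D object observed by two cameras, a plain linear Kalman filter instead of the paper's EKF, a single unknown calibration offset for camera 2 (camera 1 taken as the reference), and no association ambiguity. The function name `em_track` and all parameter values are assumptions for illustration, not the authors' method.

```python
import numpy as np

def em_track(z1, z2, r=0.25, q=0.01, iters=20):
    """EM-style joint tracking and calibration (toy sketch).

    z1, z2 : measurements of the same 1-D object from two cameras;
             camera 2 has an unknown additive calibration offset.
    r, q   : measurement-noise and process-noise variances (assumed known).
    Returns the filtered state estimates and the estimated offset.
    """
    b2 = 0.0                 # initial guess for camera-2 offset
    n = len(z1)
    xs = np.empty(n)
    for _ in range(iters):
        # "E-step" (approximate): Kalman filter fusing both cameras,
        # treating the current offset estimate b2 as the truth.
        x, p = z1[0], 1.0
        for k in range(n):
            p += q                            # predict (random-walk motion)
            for z in (z1[k], z2[k] - b2):     # sequential measurement updates
                kg = p / (p + r)              # Kalman gain
                x += kg * (z - x)
                p *= (1.0 - kg)
            xs[k] = x
        # "M-step": re-estimate the offset from camera-2 residuals
        # against the current state trajectory.
        b2 = float(np.mean(z2 - xs))
    return xs, b2
```

Each EM pass shrinks the offset error by a constant factor (the filter absorbs only part of the residual bias into the state), so a few dozen iterations suffice in this toy setting. The real method additionally estimates association parameters and uses an EKF to handle the nonlinear camera geometry.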

Published in:

2011 Fifth ACM/IEEE International Conference on Distributed Smart Cameras (ICDSC)

Date of Conference:

22-25 Aug. 2011