
A unified framework for tracking through occlusions and across sensor gaps

Authors (5): R. Kaucic, A. G. A. Perera, G. Brooksby, J. Kaufhold, et al.

Affiliation: GE Global Research, Niskayuna, NY, USA

Abstract:

A common difficulty encountered in tracking applications is how to track an object that becomes totally occluded, possibly for a significant period of time. Another problem is how to associate objects, or tracklets, across non-overlapping cameras, or between observations of a moving sensor that switches fields of regard. A third problem is how to update appearance models for tracked objects over time. Rather than using a comprehensive multi-object tracker that must deal with all of these challenges simultaneously, we present a novel, modular framework that handles each of them in a unified manner through the initialization, tracking, and linking of high-confidence tracklets. In this track/suspend/match paradigm, we first analyze the scene to identify areas where tracked objects are likely to become occluded. Tracking is then suspended on occluded objects and re-initiated when they emerge from behind the occlusion. We then associate, or match, suspended tracklets with the new tracklets using full kinematic models for object motion and Gibbsian distributions for object appearance in order to complete the track through the occlusion. Sensor gaps are handled in a similar manner: tracking is suspended when the sensor looks away and re-initiated when the sensor returns. Changes in object appearance and orientation during tracking are also seamlessly handled in this framework. Tracklets with low lock scores are terminated, tracking then resumes on untracked movers with correspondingly updated appearance models, and these new tracklets are linked back to the terminated ones as appropriate. Fully automatic tracking results from a moving sensor are presented.
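The tracklet-linking step of the track/suspend/match paradigm can be illustrated with a small sketch. The paper itself uses full kinematic models and Gibbsian appearance distributions; the constant-velocity prediction, squared-difference appearance energy, greedy assignment, and all names below (Tracklet, match_score, link_tracklets, sigma, temperature) are simplified assumptions for illustration, not the authors' implementation.

    import numpy as np


    class Tracklet:
        """Minimal tracklet record: last kinematic state under an assumed
        constant-velocity model plus an appearance feature vector (e.g., a
        color histogram). Fields are assumptions for this sketch."""

        def __init__(self, position, velocity, end_time, appearance):
            self.position = np.asarray(position, dtype=float)    # last observed (x, y)
            self.velocity = np.asarray(velocity, dtype=float)    # estimated (vx, vy) per frame
            self.end_time = end_time                             # frame where tracking was suspended
            self.appearance = np.asarray(appearance, dtype=float)


    def kinematic_cost(suspended, new, new_start_time):
        """Squared error of a constant-velocity extrapolation across the
        occlusion or sensor gap (a stand-in for the paper's full kinematic
        model)."""
        dt = new_start_time - suspended.end_time
        predicted = suspended.position + dt * suspended.velocity
        return float(np.sum((predicted - new.position) ** 2))


    def appearance_energy(suspended, new):
        """Appearance mismatch energy E; exp(-E / T) below gives a
        Gibbs-style appearance likelihood."""
        return float(np.sum((suspended.appearance - new.appearance) ** 2))


    def match_score(suspended, new, new_start_time, sigma=10.0, temperature=1.0):
        """Combined link likelihood: a kinematic gate times a Gibbsian
        appearance term."""
        kin = np.exp(-kinematic_cost(suspended, new, new_start_time) / (2.0 * sigma**2))
        app = np.exp(-appearance_energy(suspended, new) / temperature)
        return kin * app


    def link_tracklets(suspended, new, new_start_times, min_score=1e-3):
        """Greedily link each suspended tracklet to its best-scoring new
        tracklet. (The paper does not specify the assignment method; a
        global optimizer such as the Hungarian algorithm could be
        substituted.)"""
        links, used = {}, set()
        for i, s in enumerate(suspended):
            candidates = [
                (match_score(s, n, t), j)
                for j, (n, t) in enumerate(zip(new, new_start_times))
                if j not in used
            ]
            if not candidates:
                continue
            best_score, best_j = max(candidates)
            if best_score >= min_score:
                links[i] = best_j
                used.add(best_j)
        return links

Under these assumptions, a sensor gap is handled identically to an occlusion: the gap length enters only through dt in the kinematic extrapolation, and the same scoring links suspended tracklets to those re-initiated when the sensor returns.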

Published in:

2005 IEEE Computer Society Conference on Computer Vision and Pattern Recognition (CVPR 2005), Volume 1

Date of Conference:

20-25 June 2005