Video object tracking with feedback of performance measures

3 Author(s)
Erdem, C.E. (Dept. of Electr. & Electron. Eng., Bogazici Univ., Istanbul, Turkey); Tekalp, A.M.; Sankur, B.

This paper presents a scalable object tracking framework capable of tracking the contour of nonrigid objects in the presence of occlusion. The framework consists of an open-loop boundary prediction part and a closed-loop boundary correction part. The open-loop prediction block adaptively divides the object contour into subcontours and estimates the mapping parameters for each subcontour. The closed-loop boundary correction block employs a suitably weighted combination of low-level features, such as color edges, color segmentation, motion models, and motion segmentation, for each subcontour. Performance evaluation measures are used in a feedback loop to assess the goodness of the segmentation/tracking and to adjust the weights assigned to each of these low-level features for each subcontour at each frame. The framework is scalable in that it can track a coarse estimate of the boundary of selected objects in real time, or perform pixel-accurate boundary tracking in off-line mode. The proposed method does not depend on any single motion or shape model and does not require training. Experimental results demonstrate that the algorithm tracks object boundaries under significant occlusion and background clutter.
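The abstract describes per-subcontour feature weights that are adjusted at each frame from performance-evaluation feedback. The following is a rough illustrative sketch of that weight-update-and-fusion idea only; the function names, the 0-to-1 performance scores, and the linear update rule are assumptions made for illustration and are not taken from the paper.

```python
import numpy as np

# Hypothetical low-level features mentioned in the abstract.
FEATURES = ["color_edge", "color_seg", "motion_model", "motion_seg"]

def update_weights(weights, performance, learning_rate=0.5):
    """Re-weight the features for one subcontour using per-feature
    performance scores in [0, 1]; higher score -> larger weight.
    (Assumed update rule, not the paper's.)"""
    adjusted = weights * (1.0 - learning_rate) + performance * learning_rate
    return adjusted / adjusted.sum()  # keep the weights normalized

def combine_boundary_estimates(estimates, weights):
    """Fuse per-feature boundary estimates (N x 2 point arrays sampled
    at the same contour positions) with the current weights."""
    stacked = np.stack(estimates)                  # shape (F, N, 2)
    return np.tensordot(weights, stacked, axes=1)  # shape (N, 2)

# Toy usage for a single subcontour over one frame.
weights = np.full(len(FEATURES), 1.0 / len(FEATURES))
estimates = [np.random.rand(20, 2) for _ in FEATURES]  # stand-in boundaries
performance = np.array([0.9, 0.4, 0.7, 0.2])           # stand-in feedback scores
weights = update_weights(weights, performance)
corrected_boundary = combine_boundary_estimates(estimates, weights)
```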

Published in: IEEE Transactions on Circuits and Systems for Video Technology (Volume: 13, Issue: 4)