
Tracking Nonstationary Visual Appearances by Data-Driven Adaptation

Authors: Ming Yang (NEC Labs America, Inc., Cupertino, CA); Zhimin Fan; Jialue Fan; Ying Wu

Without any prior knowledge about the target, its appearance is usually the only cue available in visual tracking. In general, however, appearances are often nonstationary, which may invalidate predefined visual measurements and often leads to tracking failure in practice. A natural solution is therefore to adapt the observation model to the nonstationary appearance. This idea, however, is threatened by the risk of adaptation drift, which originates in the ill-posed nature of the adaptation problem, unless good data-driven constraints are imposed. Different from most existing adaptation schemes, we enforce three novel constraints for the optimal adaptation: (1) negative data, (2) bottom-up pairwise data constraints, and (3) adaptation dynamics. Substantializing the general adaptation problem as a subspace adaptation problem, this paper presents a closed-form solution as well as a practical iterative algorithm for subspace tracking. Extensive experiments have demonstrated that the proposed approach can largely alleviate adaptation drift and achieve better tracking results for a large variety of nonstationary scenes.
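To make the idea of constrained appearance adaptation concrete, the sketch below shows a minimal, illustrative version in Python: a PCA appearance subspace is refit to newly tracked patches, and the update is accepted only if negative (background) patches remain farther from the subspace than the positives. This is an assumption-laden stand-in for the paper's data-driven constraints, not the authors' actual algorithm; all function names, the margin parameter, and the acceptance test are illustrative.

    # Minimal sketch (assumed, not the paper's exact method): adapt a PCA
    # appearance subspace to new observations while guarding against drift
    # by comparing positive and negative patches.
    import numpy as np

    def fit_subspace(patches, k):
        """Fit a rank-k PCA subspace (mean + basis) to column-stacked patches."""
        mean = patches.mean(axis=1, keepdims=True)
        U, _, _ = np.linalg.svd(patches - mean, full_matrices=False)
        return mean, U[:, :k]

    def reconstruction_error(x, mean, basis):
        """Distance from patch x to the affine subspace spanned by the basis."""
        d = x - mean.ravel()
        return np.linalg.norm(d - basis @ (basis.T @ d))

    def adapt(mean, basis, positives, negatives, k, margin=1.0):
        """Refit the subspace to new positive patches, but accept the update
        only if negatives stay farther from the subspace than positives
        (a crude stand-in for the paper's data-driven constraints)."""
        new_mean, new_basis = fit_subspace(np.column_stack(positives), k)
        pos_err = np.mean([reconstruction_error(p, new_mean, new_basis) for p in positives])
        neg_err = np.mean([reconstruction_error(n, new_mean, new_basis) for n in negatives])
        if neg_err > pos_err + margin:   # update still separates target from background
            return new_mean, new_basis
        return mean, basis               # otherwise keep the old model to avoid drift

In a tracking loop, positives would be patches sampled around the current target estimate and negatives patches sampled from surrounding background; the paper's formulation additionally exploits pairwise data constraints and adaptation dynamics, which this sketch omits.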

Published in: IEEE Transactions on Image Processing (Volume: 18, Issue: 7)