This paper addresses the problem of tracking objects with large pose changes in visual and infrared videos captured by either a dynamic or a stationary camera. We propose a novel object tracking scheme that exploits the geometric structure of Riemannian manifolds and piecewise geodesics under a Bayesian framework. Two particle filters are alternately employed to track dynamic objects: one learns object appearances online on Riemannian manifolds from tracked candidates, while the other tracks object bounding-box parameters with the manifold appearance embedded. The rationale for this enhanced manifold tracker, compared with existing ones, is the introduction of an additional state variable, so that not only is the manifold point representing the object updated, but the velocity of the dynamic manifold point is also estimated. The main contributions of the paper are: (a) an online appearance learning strategy using a particle filter on the manifold; (b) an object tracker that incorporates the manifold appearance for prediction under a particle filter framework; (c) partitioned sub-regions of the object bounding box that incorporate spatial information into the appearance; (d) Gabor features at different frequencies and orientations computed in the partitioned sub-regions for infrared (IR) video objects. Hence, the proposed tracking scheme is applicable to both visual and IR videos. Experiments on videos in which objects undergo significant pose changes show very robust tracking results. The proposed scheme is also compared with the two most relevant manifold tracking methods; results show much improved tracking performance in terms of tracking drift and the tightness and accuracy of tracked boxes.
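To make the particle-filter framework concrete, below is a minimal, generic bootstrap particle filter sketch. This is an illustration only, not the authors' method: the paper's filters operate on Riemannian manifold points and bounding-box states, whereas this sketch assumes a 1-D state with a random-walk motion model and a Gaussian observation likelihood; the function name `particle_filter` and all parameters are hypothetical.

```python
import numpy as np

def particle_filter(observations, n_particles=500, proc_std=1.0, obs_std=2.0, seed=0):
    """Generic 1-D bootstrap particle filter (illustrative sketch).

    Assumed model: random-walk state dynamics with std `proc_std`,
    Gaussian observation noise with std `obs_std`.
    Returns the posterior-mean state estimate at each time step.
    """
    rng = np.random.default_rng(seed)
    # Initialize particles around the first observation.
    particles = rng.normal(observations[0], obs_std, n_particles)
    weights = np.full(n_particles, 1.0 / n_particles)
    estimates = []
    for z in observations:
        # Predict: propagate particles through the (assumed) motion model.
        particles = particles + rng.normal(0.0, proc_std, n_particles)
        # Update: reweight by the Gaussian observation likelihood.
        weights *= np.exp(-0.5 * ((z - particles) / obs_std) ** 2)
        weights /= weights.sum()
        # Estimate: weighted mean of the particle set.
        estimates.append(float(np.dot(weights, particles)))
        # Resample (systematic) to counter weight degeneracy.
        positions = (rng.random() + np.arange(n_particles)) / n_particles
        idx = np.searchsorted(np.cumsum(weights), positions)
        particles = particles[idx]
        weights.fill(1.0 / n_particles)
    return estimates
```

In the paper's setting, two such filters alternate: one filter's state lives on the appearance manifold (with the extra velocity variable on the manifold), while the other's state holds the bounding-box parameters.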
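For contribution (d), a rough sketch of Gabor features over partitioned sub-regions is shown below. This is a simplified assumption, not the paper's exact formulation: the helper names `gabor_kernel` and `cell_gabor_features`, the grid size, and the frequency/orientation parameters are all hypothetical, and responses are sampled only at each cell centre rather than densely filtered.

```python
import numpy as np

def gabor_kernel(size, theta, lam, sigma=None, psi=0.0, gamma=0.5):
    """Real (cosine-phase) Gabor kernel of shape (size, size) at
    orientation `theta` and wavelength `lam` (illustrative parameters)."""
    if sigma is None:
        sigma = 0.56 * lam  # common rule of thumb tying bandwidth to wavelength
    half = size // 2
    y, x = np.mgrid[-half:half + 1, -half:half + 1]
    xr = x * np.cos(theta) + y * np.sin(theta)
    yr = -x * np.sin(theta) + y * np.cos(theta)
    return np.exp(-(xr**2 + (gamma * yr)**2) / (2 * sigma**2)) * np.cos(2 * np.pi * xr / lam + psi)

def cell_gabor_features(patch, grid=(3, 3),
                        thetas=(0, np.pi/4, np.pi/2, 3*np.pi/4),
                        lams=(4.0, 8.0), ksize=9):
    """Partition `patch` into grid cells; for each cell, record the magnitude
    of each Gabor filter's response at the cell centre. Returns a 1-D vector
    of length grid_rows * grid_cols * len(thetas) * len(lams)."""
    h, w = patch.shape
    half = ksize // 2
    kernels = [gabor_kernel(ksize, t, l) for t in thetas for l in lams]
    feats = []
    for i in range(grid[0]):
        for j in range(grid[1]):
            # Clamp the cell centre so the kernel window stays inside the patch.
            cy = min(max(int((i + 0.5) * h / grid[0]), half), h - half - 1)
            cx = min(max(int((j + 0.5) * w / grid[1]), half), w - half - 1)
            win = patch[cy - half:cy + half + 1, cx - half:cx + half + 1]
            feats.extend(abs(float(np.sum(win * k))) for k in kernels)
    return np.array(feats)
```

Keeping the features per sub-region, rather than pooled over the whole bounding box, preserves the spatial layout of the object, which is the motivation behind contribution (c).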