Recently, the covariance region descriptor has proved robust and versatile at a modest computational cost, and it enables efficient fusion of different types of features. In this paper, building on the covariance descriptor and the metric on the Riemannian manifold of covariance matrices, we develop a robust Bayesian tracking framework with a fragments-based representation. In this framework, the template object is represented by multiple image fragments, or patches. Each patch votes on the possible state of the object in the current frame by comparing its covariance descriptor with the corresponding image patch model. Tracking is then guided by a Bayesian state inference framework in which a particle filter propagates the sample distribution over time. The weight of each particle is obtained by combining the votes of the patches through a robust statistic. Furthermore, we extend fast covariance computation to the Bayesian tracking problem, which makes the tracking procedure more efficient. Extensive experimental results on challenging sequences demonstrate the robust tracking achieved by our algorithm.
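The two ingredients named above, the covariance region descriptor and the Riemannian metric used to compare descriptors, can be sketched as follows. This is a minimal illustration, not the paper's implementation: it assumes a grayscale patch, the common per-pixel feature vector (x, y, intensity, |Ix|, |Iy|), and the standard affine-invariant distance on symmetric positive-definite matrices computed from generalized eigenvalues; the function names are hypothetical.

```python
import numpy as np
from scipy.linalg import eigh

def covariance_descriptor(patch):
    """Covariance descriptor of an image patch (sketch).

    Each pixel contributes a feature vector (x, y, I, |Ix|, |Iy|);
    the descriptor is the 5x5 covariance matrix of these vectors.
    """
    h, w = patch.shape
    ys, xs = np.mgrid[0:h, 0:w]
    Iy, Ix = np.gradient(patch.astype(float))   # image gradients
    feats = np.stack([xs.ravel().astype(float),
                      ys.ravel().astype(float),
                      patch.ravel().astype(float),
                      np.abs(Ix).ravel(),
                      np.abs(Iy).ravel()], axis=0)
    return np.cov(feats)                        # 5x5 SPD matrix

def riemannian_distance(C1, C2):
    """Affine-invariant distance between SPD matrices:
    sqrt(sum_i ln^2 lambda_i), where lambda_i are the
    generalized eigenvalues of the pair (C1, C2)."""
    lam = eigh(C1, C2, eigvals_only=True)
    return np.sqrt(np.sum(np.log(lam) ** 2))
```

In a fragments-based scheme along these lines, each patch's vote would be a decreasing function of `riemannian_distance` between its current descriptor and its stored model, and the particle weights would combine those votes robustly.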