This paper proposes a new tracking algorithm that combines object and background information by building object and background appearance models simultaneously through non-parametric kernel density estimation. The major contribution is a novel bidirectional learning framework for discriminating between the object and the background. It has the following advantages: 1) it embeds background information, unlike most other methods that focus on the object only; 2) it provides a mechanism to detect occlusion and distraction, two main causes of tracking failure; 3) it performs feature selection, making the tracker more robust to outliers. Through this learning framework, we are able to embed discriminative information into the generative appearance model. Experimental results demonstrate that the tracker is able to handle drastic appearance changes and is robust to occlusion and distraction.
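The abstract does not spell out the estimator, but the core idea of modeling object and background appearance with non-parametric kernel density estimation can be sketched as follows. This is a minimal illustration, not the paper's method: the Gaussian kernel, the bandwidth value, and the likelihood-ratio classification of a pixel are all assumptions for the example.

```python
import numpy as np

def kde(samples, query, bandwidth=0.1):
    """Non-parametric Gaussian kernel density estimate.

    samples: (n, d) array of feature vectors (e.g. pixel colors)
    query:   (m, d) array of points at which to evaluate the density
    """
    n, d = samples.shape
    # Pairwise squared distances between each query point and each sample
    diff = query[:, None, :] - samples[None, :, :]
    sq_dist = np.sum(diff ** 2, axis=-1)
    # Gaussian kernel normalization constant in d dimensions
    norm = (2.0 * np.pi * bandwidth ** 2) ** (d / 2.0)
    return np.exp(-sq_dist / (2.0 * bandwidth ** 2)).sum(axis=1) / (n * norm)

# Hypothetical example: object colors cluster near 0.2, background near 0.7
obj_samples = np.random.default_rng(0).normal(0.2, 0.05, size=(200, 3))
bg_samples = np.random.default_rng(1).normal(0.7, 0.05, size=(200, 3))

# Classify a pixel by the ratio of object to background density
pixel = np.array([[0.22, 0.19, 0.21]])
ratio = kde(obj_samples, pixel) / (kde(bg_samples, pixel) + 1e-12)
is_object = ratio[0] > 1.0
```

In a tracker along these lines, both densities would be re-estimated as new frames arrive, so the object/background decision adapts to appearance changes.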