Visual tracking can be formulated as a problem of estimating the state of a target representation from observations in an image sequence. When visual tracking is approached in the Bayesian filtering framework, a critical factor is how to sample from the state evolution model so as to generate hypotheses with a high confidence level. In this paper, we introduce an interacting multiple model estimation (IMME) framework for adaptive visual tracking. The essence of the IMME framework is that the state is estimated by running several different models in parallel and by probabilistically interacting among the models' estimates. Based on the IMME framework, we propose a new variant of the particle filter, named the interacting multiple model particle filter (IMMPF), in which hypotheses can be sampled adaptively from several different state evolution models. Experiments show that, compared with the standard particle filter, the IMMPF generates better hypotheses and hence better tracking results, especially when the target switches randomly among several motion modes.
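The multiple-model idea described above can be sketched in a minimal form. The following is not the paper's implementation; it assumes a hypothetical 1-D target, two hand-picked motion models (near-constant velocity and random walk), a fixed Markov mode-transition matrix, and per-particle mode switching as one simple way to realize the interacting step:

```python
import math
import random

random.seed(0)

# Hypothetical mode-transition matrix (assumption, not from the paper):
# row i gives P(next mode | current mode i); rows sum to 1.
TRANS = [[0.9, 0.1],
         [0.1, 0.9]]

def propagate(x, v, mode):
    """Sample the next state from the chosen mode's evolution model."""
    if mode == 0:                       # near-constant velocity
        return x + v + random.gauss(0, 0.1), v
    else:                               # random walk with larger noise
        return x + random.gauss(0, 1.0), v

def likelihood(z, x, sigma=0.5):
    """Gaussian observation likelihood p(z | x) up to a constant."""
    return math.exp(-0.5 * ((z - x) / sigma) ** 2)

def immpf_step(particles, z):
    """One IMMPF update: mode switch, propagate, weight, resample."""
    moved, weights = [], []
    for (x, v, mode) in particles:
        # interacting step: each particle may switch its motion mode
        mode = 0 if random.random() < TRANS[mode][0] else 1
        x, v = propagate(x, v, mode)
        moved.append((x, v, mode))
        weights.append(likelihood(z, x))
    if sum(weights) == 0:               # degenerate case: fall back to uniform
        weights = None
    # multinomial resampling proportional to the observation likelihoods
    return random.choices(moved, weights=weights, k=len(particles))

# Toy run: the target moves with velocity 1, then stalls (a mode change
# the constant-velocity model alone cannot follow).
particles = [(0.0, 1.0, 0) for _ in range(500)]
observations = [1.0, 2.0, 3.0, 3.2, 3.3]
for z in observations:
    particles = immpf_step(particles, z)
est = sum(p[0] for p in particles) / len(particles)
```

After the stall, the random-walk particles dominate the resampling because the constant-velocity particles overshoot, so the posterior mean stays close to the last observation; this is the adaptive behavior the IMME framework aims for.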