Tracking of regions in image sequences plays a fundamental role in applications such as search and retrieval in video databases, object-based coding (as in MPEG-4), and surveillance. Although numerous approaches to region tracking have been developed, they all suffer from severe constraints imposed on the nature of the image sequence. Some assume a particular motion model or constrain the range of interframe motion, while others constrain both the tracked region and the background to have uniform and contrasting intensities. As a result, these tracking algorithms become byproducts of algorithms for motion or intensity boundary detection, and thus have limited applicability. We propose a novel algorithm for region tracking that builds on a previously developed Bayesian tracking framework. We extend this framework by re-expressing tracking in terms of the Kullback-Leibler divergence of specific probability distributions and generalizing these to empirical distributions computed over image neighborhoods, leading to level set equations in terms of local image statistics. The main novelty of our proposed algorithm is that, unlike other tracking algorithms expressed as level set PDEs, it does not assume that the motion is small, that the background is stationary, or that the region is uniform and strongly contrasted with the background. We illustrate the performance of our algorithm on real image sequences with natural motion.
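The abstract does not give code, but the core comparison it describes, measuring the Kullback-Leibler divergence between empirical distributions computed over image neighborhoods, can be sketched in a few lines. This is a minimal illustration only, not the paper's level set formulation; the patch sizes, bin count, smoothing constant, and the synthetic patches below are all assumptions for demonstration.

```python
import numpy as np

def empirical_distribution(patch, bins=16, value_range=(0.0, 1.0)):
    """Empirical intensity distribution of an image neighborhood (a normalized
    histogram). The bin count and smoothing constant are illustrative choices."""
    hist, _ = np.histogram(patch, bins=bins, range=value_range)
    hist = hist.astype(float) + 1e-10  # smoothing avoids log(0) in the divergence
    return hist / hist.sum()

def kl_divergence(p, q):
    """Kullback-Leibler divergence D(p || q) between two discrete distributions."""
    return float(np.sum(p * np.log(p / q)))

# Hypothetical example: a brighter "region" patch vs. a darker "background" patch.
rng = np.random.default_rng(0)
region = np.clip(rng.normal(0.7, 0.05, (9, 9)), 0.0, 1.0)
background = np.clip(rng.normal(0.3, 0.05, (9, 9)), 0.0, 1.0)

p = empirical_distribution(region)
q = empirical_distribution(background)

# Dissimilar neighborhoods yield a large divergence; a patch compared with
# itself yields (essentially) zero.
print(kl_divergence(p, q) > kl_divergence(p, p))
```

In the paper's setting, such neighborhood statistics would drive a level set evolution rather than a direct patch comparison; the sketch only shows why local empirical distributions can separate a region from its background without assuming uniform intensity or strong contrast at every pixel.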