We present an efficient and accurate object tracking algorithm based on the concept of graph cut segmentation. The ability to track visible objects in real-time provides an invaluable tool for the implementation of markerless Augmented Reality. Once an object has been detected, its location in future frames can be used to position virtual content, and thus annotate the environment. Unlike many object tracking algorithms, our approach does not rely on a preexisting 3D model or any other information about the object or its environment. It takes, as input, a set of pixels representing an object in an initial frame and uses a combination of optical flow and graph cut segmentation to determine the corresponding pixels in each subsequent frame. Experiments show that our algorithm robustly tracks objects of disparate shapes and sizes over hundreds of frames, and can even handle difficult cases where an object shares many colors with its background. We further show how this technology can be applied to practical AR applications.
Date of Conference: 13-16 Nov. 2007
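The abstract's propagate-then-refine idea (shift the previous frame's object mask along the optical flow, then refine the shifted mask with a graph cut) can be sketched in a toy form. The sketch below is our own minimal illustration, not the paper's method: `track_step` and `min_cut` are hypothetical helpers, the flow is reduced to a single given translation vector, the unary terms are simple color-difference penalties against mean object/background colors, and the pairwise smoothness weight `lam` is an arbitrary constant. The min cut is solved with a plain Edmonds-Karp max-flow.

```python
from collections import deque


def min_cut(n, s, t, cap):
    """Edmonds-Karp max-flow; returns the set of nodes on the source
    side of a minimum s-t cut (source side = 'object' label)."""
    flow = [[0.0] * n for _ in range(n)]
    while True:
        parent = [-1] * n                    # BFS for a shortest augmenting path
        parent[s] = s
        q = deque([s])
        while q and parent[t] == -1:
            u = q.popleft()
            for v in range(n):
                if parent[v] == -1 and cap[u][v] - flow[u][v] > 1e-9:
                    parent[v] = u
                    q.append(v)
        if parent[t] == -1:                  # no augmenting path left: done
            break
        bott, v = float("inf"), t            # bottleneck residual capacity
        while v != s:
            u = parent[v]
            bott = min(bott, cap[u][v] - flow[u][v])
            v = u
        v = t
        while v != s:                        # push flow along the path
            u = parent[v]
            flow[u][v] += bott
            flow[v][u] -= bott
            v = u
    seen, q = {s}, deque([s])                # residual reachability from s
    while q:
        u = q.popleft()
        for v in range(n):
            if v not in seen and cap[u][v] - flow[u][v] > 1e-9:
                seen.add(v)
                q.append(v)
    return seen


def track_step(frame, prev_mask, flow=(0, 0), lam=2.0):
    """One tracking step on a grayscale frame (list of rows): propagate
    the previous mask by a single flow vector, then refine by graph cut."""
    h, w = len(frame), len(frame[0])
    dy, dx = flow
    prop = [[0] * w for _ in range(h)]       # flow-propagated mask
    for y in range(h):
        for x in range(w):
            if prev_mask[y][x] and 0 <= y + dy < h and 0 <= x + dx < w:
                prop[y + dy][x + dx] = 1
    obj = [frame[y][x] for y in range(h) for x in range(w) if prop[y][x]]
    bg = [frame[y][x] for y in range(h) for x in range(w) if not prop[y][x]]
    mo, mb = sum(obj) / len(obj), sum(bg) / len(bg)  # mean colors
    n, s, t = h * w + 2, h * w, h * w + 1
    cap = [[0.0] * n for _ in range(n)]
    for y in range(h):
        for x in range(w):
            p = y * w + x
            cap[s][p] = abs(frame[y][x] - mb)  # cost of a background label
            cap[p][t] = abs(frame[y][x] - mo)  # cost of an object label
            for ny, nx in ((y + 1, x), (y, x + 1)):  # 4-neighbor smoothness
                if ny < h and nx < w:
                    q2 = ny * w + nx
                    cap[p][q2] = cap[q2][p] = lam
    src_side = min_cut(n, s, t, cap)
    return [[1 if y * w + x in src_side else 0 for x in range(w)]
            for y in range(h)]
```

With a bright 2x2 object on a dark background and a flow of one pixel down and right, the propagated mask lands on the object and the cut keeps exactly the bright pixels. A real implementation would use per-pixel flow, richer color models, and a faster max-flow solver; the constant `lam` stands in for a contrast-sensitive pairwise term.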