A novel method is proposed for matching articulated objects in cluttered videos. The method needs only a single exemplar image of the target object. Instead of representing an articulated object with a small set of large parts, the proposed model uses hundreds of small units to represent walks along paths of pixels between key points on the object. Matching directly on dense pixels is key to reliable matching under motion blur. The proposed method fits the model to local image properties, conforms to structure constraints, and remembers the steps taken along a pixel path. The model formulation handles variations in object scale, rotation, and articulation. Recovery of the optimal pixel walks is posed as a special shortest path problem, which can be solved efficiently via dynamic programming. Further speedup is achieved by factorizing the path costs. An efficient method is proposed to find multiple walks and simultaneously match multiple key points. Experiments show that the proposed method is efficient and reliable and can match articulated objects in fast-motion videos with strong clutter and blurry imagery.
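The dynamic-programming recovery of an optimal pixel walk can be sketched, in spirit, as a Viterbi-style minimization: at each step along the walk one candidate pixel is chosen so that the total of per-step appearance costs and pairwise structure costs is minimized. The function name, the cost arrays, and their shapes below are illustrative assumptions for exposition, not the paper's actual formulation:

```python
import numpy as np

def best_pixel_walk(unary, pairwise):
    """Viterbi-style shortest path over a chain of candidate pixels.

    unary:    (T, K) array; unary[t, j] is the appearance cost of
              candidate pixel j at walk step t (illustrative).
    pairwise: (T-1, K, K) array; pairwise[t, i, j] is the structure
              cost of stepping from candidate i to candidate j.
    Returns the minimizing candidate index per step and its total cost.
    """
    T, K = unary.shape
    cost = unary[0].copy()                 # best cost to reach each candidate
    back = np.zeros((T, K), dtype=int)     # backpointers for path recovery
    for t in range(1, T):
        # cost of arriving at candidate j via the best predecessor i
        trans = cost[:, None] + pairwise[t - 1]   # shape (K, K)
        back[t] = trans.argmin(axis=0)
        cost = trans.min(axis=0) + unary[t]
    # backtrack from the cheapest terminal candidate
    path = [int(cost.argmin())]
    for t in range(T - 1, 0, -1):
        path.append(int(back[t, path[-1]]))
    return path[::-1], float(cost.min())
```

With T steps and K candidates per step this runs in O(T K^2) time; the cost factorization mentioned in the abstract would reduce the per-step transition work below the dense K x K evaluation used here.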