We propose a novel 3D model-based framework for tracking 3D human motion in cluttered environments using an animatable 3D geometric human model that resembles the subject and is textured with the subject's real appearance. Our method synthesizes the 3D posture that minimizes the difference between the image of the synthesized movement and the real image via a numerical minimization kernel. This approach overcomes the ill-posed problems of existing methods that rely heavily on standard image segmentation techniques such as background subtraction. In addition, to produce a better 3D geometric model for tracking, we propose a three-filter set that substantially reduces the surface distortions of a low-cost human reconstruction method. Our results demonstrate that our method copes well with clutter and occlusions.
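The core analysis-by-synthesis loop described above can be illustrated with a minimal sketch. The abstract does not specify the renderer, the cost function, or the minimization kernel, so everything below is a hypothetical stand-in: a toy "renderer" mapping pose parameters (joint angles of a 2-link planar arm) to synthetic 2D joint positions, a sum-of-squared-differences cost against the observation, and SciPy's general-purpose `minimize` playing the role of the numerical minimization kernel.

```python
import numpy as np
from scipy.optimize import minimize

def render(pose):
    """Toy stand-in for the synthesis step: map pose parameters
    (two joint angles of a planar 2-link arm, unit link lengths)
    to a synthetic 'image' (the 2D elbow and wrist positions)."""
    a1, a2 = pose
    elbow = np.array([np.cos(a1), np.sin(a1)])
    wrist = elbow + np.array([np.cos(a1 + a2), np.sin(a1 + a2)])
    return np.concatenate([elbow, wrist])

# "Observed image": here simply the rendering of an unknown true pose.
true_pose = np.array([0.6, -0.4])
observed = render(true_pose)

def cost(pose):
    """Discrepancy between the synthesized and the observed image."""
    return np.sum((render(pose) - observed) ** 2)

# Analysis-by-synthesis: search for the pose whose synthetic image
# best matches the observation, via numerical minimization.
result = minimize(cost, x0=np.zeros(2), method="BFGS")
recovered_pose = result.x
```

In the actual framework the renderer would produce a textured image of the subject's 3D model rather than joint positions, but the structure of the loop (synthesize, compare to the real image, update the pose) is the same.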