We propose a complete solution for robust and accurate object tracking in the face of various types of occlusion, which pose many challenges to correctly judging the occlusion situation and properly updating the target template. To tackle these challenges, we first propose a content-adaptive progressive occlusion analysis (CAPOA) algorithm. By combining the information provided by spatiotemporal context, the reference target, and motion constraints, the algorithm makes a clear distinction between the target and outliers. Accurate tracking of an occluded target is achieved by rectifying the target location using variant-mask template matching (VMTM). To deal with template drift during template update, we propose a drift-inhibitive masked Kalman appearance filter (DIMKAF), which accurately evaluates the influence of template drift when updating the masked template. Finally, we devise a local best match authentication (LBMA) algorithm to handle complete occlusions, enabling much more trustworthy detection of the end of an arbitrarily long complete occlusion. Experimental results show that the proposed solution tracks targets reliably and accurately under short-term, long-term, partial, and complete occlusions.
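The abstract does not detail DIMKAF's update equations, but the general idea of a masked Kalman appearance filter can be illustrated with a hedged sketch: each template pixel is treated as an independent Kalman state, and pixels flagged as occluded by the occlusion analysis are excluded from the correction step so that outlier appearance does not drift the template. The function name, noise parameters, and per-pixel independence assumption below are illustrative choices, not the paper's actual formulation.

```python
import numpy as np

def masked_kalman_template_update(template, variance, observation, mask,
                                  process_noise=1e-3, measurement_noise=1e-1):
    """Sketch of a per-pixel masked Kalman update of an appearance template.

    template, variance : current per-pixel appearance estimate and its variance
    observation        : newly observed appearance of the target region
    mask               : 1 where a pixel is judged visible (target), 0 where it
                         is judged occluded; occluded pixels keep the predicted
                         state and receive no measurement correction
    """
    # Predict step: appearance is assumed quasi-static, so the state is carried
    # over while its uncertainty grows by the process noise.
    variance_pred = variance + process_noise

    # Per-pixel Kalman gain; the mask zeroes the gain for occluded pixels,
    # which is what prevents occluders from being blended into the template.
    gain = mask * variance_pred / (variance_pred + measurement_noise)

    # Correct step: visible pixels are pulled toward the observation in
    # proportion to the gain; occluded pixels are left unchanged.
    template_new = template + gain * (observation - template)
    variance_new = (1.0 - gain) * variance_pred
    return template_new, variance_new
```

With this structure, an occluded pixel's variance keeps growing while it is masked out, so once it becomes visible again the filter adapts to it quickly, which matches the abstract's goal of inhibiting drift without freezing the template.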