Distributed human tracking in smart camera networks by adaptive particle filtering and data fusion

Authors: Fatemeh Rezaei (Department of Electrical Engineering, Sharif University of Technology, Iran); Babak H. Khalaj

Human tracking is an essential step in many computer vision-based applications. As single-view tracking may not be sufficiently robust and accurate, tracking based on multiple cameras has been widely considered in recent years. This paper presents a distributed human tracking method for a smart camera network and introduces a particle filter design based on the Histogram of Oriented Gradients (HOG) and color histogram. The proposed adaptive motion model also estimates the target speed from the history of its latest displacements and improves the robustness of the tracker by decreasing the probability of missing targets. In addition, a distributed data fusion method is proposed that fuses the information from the cameras by an adaptive weighted average. Each camera sends its own beliefs of the targets' states and the corresponding weights to other cameras in its communication range. The target fusion weights are determined by each camera, based on the certainty of the corresponding view for each target and an occlusion indicator that depends on the distance between detected targets. The performance of the proposed scheme is evaluated using the PETS2009 S2.L1 dataset. It is shown that the proposed data fusion method leads to more robust tracking among multiple cameras and improves handling of uncertainties and occlusions using multi-view information. In addition, the amount of data transferred in the network is significantly reduced in comparison with centralized methods.
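The adaptive weighted-average fusion described in the abstract can be sketched as follows. This is a minimal illustration, not the authors' implementation: the function names (`occlusion_indicator`, `fuse_beliefs`), the linear distance-based occlusion score, and the `radius` parameter are all assumptions made for the example; the paper only states that weights depend on per-view certainty and an inter-target-distance occlusion indicator.

```python
import numpy as np

def occlusion_indicator(target_pos, other_positions, radius=1.0):
    """Hypothetical occlusion score: 1.0 when no other detected target
    lies within `radius` of this target, decaying linearly toward 0 as
    the nearest neighbour gets closer (a stand-in for the paper's
    distance-based indicator)."""
    if len(other_positions) == 0:
        return 1.0
    d_min = min(np.linalg.norm(target_pos - p) for p in other_positions)
    return min(1.0, d_min / radius)

def fuse_beliefs(states, certainties, occlusions):
    """Adaptive weighted average of per-camera beliefs for one target.

    states:      (n_cameras, state_dim) local state estimates
    certainties: per-camera view certainty for this target
    occlusions:  per-camera occlusion indicator (1 = unoccluded)
    """
    weights = np.asarray(certainties, float) * np.asarray(occlusions, float)
    weights = weights / weights.sum()      # normalise fusion weights
    return weights @ np.asarray(states)    # weighted average of beliefs
```

In this sketch, a camera whose view of the target is uncertain or occluded contributes proportionally less to the fused state, which is the qualitative behaviour the abstract attributes to the proposed fusion scheme.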

Published in:

2012 Sixth International Conference on Distributed Smart Cameras (ICDSC)

Date of Conference:

Oct. 30 - Nov. 2, 2012