
Multi-Sensor Fusion Tracking Using Visual Information and Wi-Fi Location Estimation


3 Author(s)
Miyaki, T.; Tokyo Univ., Tokyo; Yamasaki, T.; Aizawa, K.

In a widely distributed camera network, object tracking is an important function for surveillance applications. This paper describes an object tracking scheme based on a sensor fusion approach that combines visual information from cameras with location information from a Wi-Fi location estimation system. The location estimate is computed from the received signal strength of beacon packets transmitted by Wi-Fi access points (APs) around the targets. Unlike conventional approaches that rely on other kinds of sensors (e.g., the global positioning system (GPS), floor pressure sensors, or laser range scanners), our approach covers wider areas, both indoors and outdoors, at lower cost. A particle filter is applied to combine these two different kinds of sensory input and track the target with stable performance: a Wi-Fi observation model is incorporated into a conventional visual particle filtering scheme to evaluate the importance weight of each particle. We present experimental results from an outdoor surveillance camera environment.
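The fusion idea in the abstract can be sketched as follows. This is a minimal illustration, not the authors' implementation: the log-distance path-loss constants, the distance-weighted-centroid Wi-Fi localizer, the Gaussian Wi-Fi observation model, the random-walk motion model, and all function names and parameters are assumptions chosen for the sketch. Each particle's importance weight is the product of a visual likelihood and a Wi-Fi likelihood, as the abstract describes.

```python
import numpy as np

rng = np.random.default_rng(0)

def rssi_to_position(ap_positions, rssi_dbm, path_loss_exp=2.0, tx_dbm=-40.0):
    """Rough Wi-Fi location estimate (illustrative): invert a log-distance
    path-loss model to get per-AP distances, then return the
    inverse-distance-weighted centroid of the AP positions."""
    dists = 10.0 ** ((tx_dbm - rssi_dbm) / (10.0 * path_loss_exp))
    w = 1.0 / dists
    return (ap_positions * w[:, None]).sum(axis=0) / w.sum()

def particle_filter_step(particles, weights, visual_lik, wifi_pos,
                         wifi_sigma=3.0, motion_sigma=0.5):
    """One predict/update cycle fusing a visual likelihood function with a
    Gaussian Wi-Fi observation model, followed by systematic resampling."""
    # Predict: simple random-walk motion model.
    particles = particles + rng.normal(0.0, motion_sigma, particles.shape)
    # Update: importance weight = visual likelihood x Wi-Fi likelihood.
    d2 = ((particles - wifi_pos) ** 2).sum(axis=1)
    wifi_lik = np.exp(-d2 / (2.0 * wifi_sigma ** 2))
    weights = weights * visual_lik(particles) * wifi_lik
    weights = weights / weights.sum()
    # Systematic resampling back to uniform weights.
    n = len(particles)
    positions = (np.arange(n) + rng.random()) / n
    idx = np.searchsorted(np.cumsum(weights), positions)
    return particles[idx], np.full(n, 1.0 / n)
```

In this sketch the Wi-Fi term acts as a coarse prior that keeps the filter from locking onto visually similar distractors, while the visual likelihood provides the fine-grained localization.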

Published in:

First ACM/IEEE International Conference on Distributed Smart Cameras (ICDSC '07)

Date of Conference:

25-28 Sept. 2007