Vision and Laser Data Fusion for Tracking People with a Mobile Robot

Authors:
Bellotto, N.; Huosheng Hu — Dept. of Computer Science, University of Essex, Colchester

In this paper we present a multi-sensor fusion system for tracking people with a mobile robot, which integrates the information provided by a laser range sensor and a PTZ camera. We introduce the algorithms used for detecting legs in laser scans and faces in video images, then illustrate a human motion model for estimating people's position, orientation and height. The ego-motion of the robot is also taken into account, and the information is fused using an implementation of the unscented Kalman filter. Finally, multiple human tracks are generated and maintained thanks to an appropriate data association procedure. The results of several experiments are illustrated, proving the effectiveness of our approach, and some considerations are drawn.
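To make the fusion step concrete, the following is a minimal sketch of one unscented Kalman filter predict/update cycle in Python/NumPy. This is a generic UKF, not the paper's implementation: the constant-velocity motion model, the position-only measurement function (a stand-in for the fused leg/face detections), and all noise values are illustrative assumptions, and the paper's state additionally includes orientation and height.

```python
import numpy as np

def ukf_step(x, P, z, f, g, Q, R, kappa=None):
    """One predict/update cycle of an unscented Kalman filter.

    x, P : state estimate and covariance
    z    : measurement vector
    f, g : (possibly nonlinear) motion and measurement functions
    Q, R : process and measurement noise covariances
    """
    n = x.size
    lam = (3.0 - n) if kappa is None else kappa   # classic Julier/Uhlmann choice
    # 2n+1 sigma points around the current estimate
    L = np.linalg.cholesky((n + lam) * P)
    X = np.column_stack([x] + [x + L[:, i] for i in range(n)]
                            + [x - L[:, i] for i in range(n)])
    W = np.full(2 * n + 1, 1.0 / (2.0 * (n + lam)))
    W[0] = lam / (n + lam)
    # predict: push sigma points through the motion model
    Xp = np.column_stack([f(X[:, i]) for i in range(2 * n + 1)])
    xp = Xp @ W
    Pp = Q + sum(w * np.outer(d, d) for w, d in zip(W, (Xp - xp[:, None]).T))
    # update: push the *predicted* sigma points through the measurement
    # model (a common simplification that skips redrawing sigma points)
    Zp = np.column_stack([g(Xp[:, i]) for i in range(2 * n + 1)])
    zp = Zp @ W
    Pzz = R + sum(w * np.outer(d, d) for w, d in zip(W, (Zp - zp[:, None]).T))
    Pxz = sum(w * np.outer(dx, dz)
              for w, dx, dz in zip(W, (Xp - xp[:, None]).T, (Zp - zp[:, None]).T))
    K = Pxz @ np.linalg.inv(Pzz)                  # Kalman gain
    return xp + K @ (z - zp), Pp - K @ Pzz @ K.T

# Toy usage: constant-velocity person, noisy position-only detections.
dt = 0.5
F = np.array([[1, 0, dt, 0],
              [0, 1, 0, dt],
              [0, 0, 1, 0],
              [0, 0, 0, 1]])
f = lambda s: F @ s        # linear here, but the UKF also accepts nonlinear f
g = lambda s: s[:2]        # observe position only (assumed measurement model)
Q, R = 0.01 * np.eye(4), 0.09 * np.eye(2)

rng = np.random.default_rng(0)
truth = np.array([0.0, 0.0, 0.5, -0.2])          # [px, py, vx, vy]
x, P = np.zeros(4), np.eye(4)
for _ in range(20):
    truth = F @ truth
    z = truth[:2] + rng.normal(0.0, 0.3, size=2)  # noisy detection
    x, P = ukf_step(x, P, z, f, g, Q, R)
```

After a few steps the filter's position estimate tracks the target more closely than the raw detections, and the velocity components are inferred even though only position is measured; a separate data association stage, as in the paper, would decide which detection feeds which track.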

Published in:

2006 IEEE International Conference on Robotics and Biomimetics (ROBIO '06)

Date of Conference:

17-20 Dec. 2006