Fusion between laser and stereo vision data for moving objects tracking in intersection like scenario

Authors:

Baig, Q. (INRIA Rhône-Alpes, Univ. of Grenoble 1, Grenoble, France); Aycard, O.; Trung Dung Vu; Fraichard, T.

Abstract:

Using multiple sensors for environment perception in autonomous vehicles is now common. Data perceived by these sensors can be fused at different levels: before object detection, after object detection, or after tracking the moving objects. In this paper we detail our fusion at the object detection level between laser and stereo vision sensors, as opposed to pre-detection or track-level fusion. We use the output of our laser processing to obtain a list of objects, each with position and dynamic properties. Similarly, we use the stereo vision output of another team, which consists of a list of detected objects, each with position and classification properties. We apply a Bayesian fusion technique to the objects of these two lists to obtain a new list of fused objects, which is then used in the tracking phase to track moving objects in an intersection-like scenario. Results obtained on data sets from the INTERSAFE-2 demonstrator vehicle show that this fusion improves the data association and track management steps.
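
To make the detection-level fusion idea concrete, the following is a minimal Python sketch, not the authors' implementation: laser and stereo detections are associated by Mahalanobis gating, and matched pairs are combined with the standard Bayesian (information-form) product of independent Gaussian position estimates, keeping dynamic properties from the laser object and the class label from the stereo object. The dictionary keys, covariance values, and the chi-square gating threshold are illustrative assumptions, not details taken from the paper.

```python
import numpy as np

def fuse_gaussian(mu_a, cov_a, mu_b, cov_b):
    """Bayesian fusion of two independent Gaussian position estimates."""
    info_a = np.linalg.inv(cov_a)
    info_b = np.linalg.inv(cov_b)
    cov_f = np.linalg.inv(info_a + info_b)
    mu_f = cov_f @ (info_a @ mu_a + info_b @ mu_b)
    return mu_f, cov_f

def fuse_object_lists(laser_objs, stereo_objs, gate=9.21):
    """Associate laser and stereo detections by Mahalanobis gating
    (gate = chi-square 99% quantile, 2 DOF), fuse matched pairs,
    and pass unmatched detections through unchanged."""
    fused, used = [], set()
    for lo in laser_objs:
        best, best_d2 = None, gate
        for j, so in enumerate(stereo_objs):
            if j in used:
                continue
            d = lo["pos"] - so["pos"]
            s = lo["cov"] + so["cov"]
            d2 = d @ np.linalg.inv(s) @ d
            if d2 < best_d2:
                best, best_d2 = j, d2
        if best is None:
            fused.append(dict(lo))  # laser-only object
        else:
            so = stereo_objs[best]
            used.add(best)
            mu, cov = fuse_gaussian(lo["pos"], lo["cov"], so["pos"], so["cov"])
            fused.append({"pos": mu, "cov": cov,
                          "velocity": lo.get("velocity"),  # dynamics from laser
                          "class": so.get("class")})       # class from stereo
    # stereo-only objects
    fused += [dict(s) for j, s in enumerate(stereo_objs) if j not in used]
    return fused

if __name__ == "__main__":
    laser = [{"pos": np.array([10.0, 2.0]), "cov": 0.2 * np.eye(2),
              "velocity": np.array([1.5, 0.0])}]
    stereo = [{"pos": np.array([10.3, 2.1]), "cov": 0.5 * np.eye(2),
               "class": "pedestrian"}]
    for obj in fuse_object_lists(laser, stereo):
        print(obj)
```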

Published in:

2011 IEEE Intelligent Vehicles Symposium (IV)

Date of Conference:

5-9 June 2011