
Collision Sensing by Stereo Vision and Radar Sensor Fusion


5 Author(s): Shunguang Wu; S. Decker; Peng Chang; T. Camus; et al. (Vision Technol., Sarnoff Corp., Princeton, NJ, USA)

To take advantage of both stereo cameras and radar, this paper proposes a fusion approach to accurately estimate the location, size, pose, and motion of a threat vehicle with respect to a host vehicle from observations obtained by both sensors. We first fit a contour to the threat vehicle from stereo depth information and find the closest point on that contour as seen by the vision sensor. The fused closest point is then obtained by fusing the radar observations with the vision closest point. Next, the fused contour is obtained by translating the fitted contour to the fused closest point. Finally, the fused contour is tracked under rigid-body constraints to estimate the location, size, pose, and motion of the threat vehicle. Experimental results on both synthetic data and real-world road-test data demonstrate the success of the proposed algorithm.
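The two central steps of the pipeline — fusing the radar and vision closest-point observations, and translating the fitted contour to the fused point — can be sketched as follows. This is a minimal illustration, not the paper's implementation: it assumes each sensor's closest-point estimate comes with a measurement covariance and combines them by inverse-covariance (information) weighting; the function names and the 2-D ground-plane representation are assumptions for the sketch.

```python
import numpy as np

def fuse_closest_point(p_vision, cov_vision, p_radar, cov_radar):
    """Fuse vision and radar closest-point estimates (2-D ground plane).

    Inverse-covariance weighting: the sensor with the smaller
    uncertainty pulls the fused estimate toward its measurement.
    Returns the fused point and its covariance.
    """
    info_v = np.linalg.inv(cov_vision)   # information matrix of vision
    info_r = np.linalg.inv(cov_radar)    # information matrix of radar
    cov_fused = np.linalg.inv(info_v + info_r)
    p_fused = cov_fused @ (info_v @ p_vision + info_r @ p_radar)
    return p_fused, cov_fused

def translate_contour(contour, p_vision_closest, p_fused):
    """Shift the stereo-fitted contour so its closest point
    coincides with the fused closest point (rigid translation)."""
    return contour + (p_fused - p_vision_closest)
```

With equal covariances the fused point is simply the midpoint of the two measurements; in practice radar is typically much more certain in range and vision in lateral position, so the fused point inherits the best of each axis.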

Published in: IEEE Transactions on Intelligent Transportation Systems (Volume 10, Issue 4)