
FARFusion: A Practical Roadside Radar-Camera Fusion System for Far-Range Perception



Abstract:

Far-range perception is essential for intelligent transportation systems. Its main challenge lies in performing accurate object detection and tracking at far distances (e.g., >150 m) at low cost. To cope with this challenge, deploying millimeter-wave Radars and high-definition (HD) cameras and fusing their data for joint perception has become common practice. The key to this solution is the precise association between the two types of data, which are captured from different perspectives. Toward this goal, the first question is on which plane to conduct the association: the 2D image plane or the BEV plane. We argue that the former is more suitable because the location errors of the perspective projection points are smaller at far distances, which leads to more accurate associations. Thus, we project Radar-based target locations from the BEV to the 2D image plane and then associate them with camera-based object locations. Subsequently, we map the camera-based object locations back to the BEV plane through inverse projection mapping (IPM), using the corresponding depth information from the Radar data. Finally, we employ a BEV tracking module to generate target trajectories for traffic monitoring. We also devise a transformation-parameter refinement approach based on a depth scaling technique. We have deployed an actual testbed on an urban expressway and conducted extensive experiments for evaluation. The results show that our system improves AP_BEV by 32% and reduces the location error by 0.56 m. Our system achieves an average location accuracy of 1.3 m within a 500 m range.
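The projection round-trip the abstract describes (Radar BEV target to image plane for association, then camera detection back to BEV via IPM with Radar depth) can be illustrated with a standard pinhole camera model. This is a minimal sketch, not the paper's implementation: the intrinsic matrix `K`, rotation `R`, and translation `t` below are illustrative placeholders for values that would come from actual roadside camera calibration.

```python
import numpy as np

# Hypothetical calibration parameters (illustrative only).
K = np.array([[2000.0,    0.0, 960.0],
              [   0.0, 2000.0, 540.0],
              [   0.0,    0.0,   1.0]])   # camera intrinsics
R = np.eye(3)                              # world-to-camera rotation
t = np.zeros(3)                            # world-to-camera translation

def bev_to_image(p_world):
    """Project a 3D point (e.g., a Radar target) onto the image plane.

    Returns pixel coordinates and the point's depth in the camera frame.
    """
    p_cam = R @ p_world + t                # world frame -> camera frame
    uv = K @ p_cam                         # perspective projection (homogeneous)
    return uv[:2] / uv[2], p_cam[2]

def image_to_bev(uv, depth):
    """Inverse projection: recover the 3D point from a pixel plus Radar depth."""
    ray = np.linalg.inv(K) @ np.array([uv[0], uv[1], 1.0])  # viewing ray
    p_cam = ray * depth                    # scale the ray by the known depth
    return R.T @ (p_cam - t)               # camera frame -> world frame

# A far-range target roughly 180 m ahead of the camera.
p = np.array([3.0, 1.0, 180.0])
pix, d = bev_to_image(p)
p_back = image_to_bev(pix, d)
assert np.allclose(p, p_back)              # round-trip recovers the BEV location
```

The key point the abstract makes is visible here: without the Radar-supplied `depth`, a camera detection only fixes a viewing ray, so the BEV location is underdetermined; the fused depth resolves the scale.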
Published in: IEEE Robotics and Automation Letters ( Volume: 9, Issue: 6, June 2024)
Page(s): 5433 - 5440
Date of Publication: 11 April 2024



