
Semantic Cameras for 360-Degree Environment Perception in Automated Urban Driving


Abstract:

The European UP-Drive project addresses transportation-related challenges by providing key contributions that enable fully automated vehicle navigation and parking in complex urban areas, resulting in a safer, more inclusive, affordable and environmentally friendly transportation system. For this purpose, the project consortium developed a prototype electric vehicle equipped with camera and LiDAR sensors that is capable of autonomously driving around the city and finding available parking spots. In UP-Drive, we created an accurate, robust and redundant multi-modal environment perception system that provides 360° coverage around the vehicle. This paper summarizes the project's work on surround-view semantic perception using fisheye and narrow field-of-view semantic virtual cameras. Deep learning-based semantic, instance and panoptic segmentation networks that satisfy the accuracy and efficiency requirements have been developed and integrated into the final prototype. The UP-Drive automated vehicle has been successfully demonstrated in urban areas after extensive experiments and numerous field tests.
Published in: IEEE Transactions on Intelligent Transportation Systems ( Volume: 23, Issue: 10, October 2022)
Page(s): 17271 - 17283
Date of Publication: 14 March 2022
