
Adaptive Multi-Sensor Integrated Navigation System Aided by Continuous Error Map from RSU for Autonomous Vehicles in Urban Areas



Abstract:

Multi-sensor integration has been widely studied to achieve precise and robust odometry for autonomous vehicles (AVs) in urban areas. LiDAR odometry and visual odometry can be affected by structureless scenes and numerous dynamic objects, while GNSS positioning can be degraded by multipath and non-line-of-sight signals caused by buildings. Selecting appropriate weightings for such heterogeneous sensors is therefore a key challenge in multi-sensor fusion. With advances in cellular vehicle-to-everything (C-V2X) communication and intelligent roadside units (RSUs), vehicles and RSUs can collaborate to deliver reliable services. Inspired by this, this paper investigates continuous error maps for the available sensors under different times of day (noon, sunset, and night) to improve the positioning performance of surrounding AVs in complex urban environments. In particular, this paper presents an error-map-aided multi-sensor integrated system that benefits from error information collected by a sensor-rich AV: the error information is uploaded to the RSUs, which then distribute it to nearby AVs. A smaller weight is assigned to a sensor when a larger error is queried from the error map. To validate the approach, experiments were performed using the realistic CARLA simulator and our self-developed GNSS RUMS simulator. To benefit the research community, we have open-sourced the implementation on our project page: https://sites.google.com/view/v2x-cooperative-navigation
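The abstract's weighting rule (larger queried error, smaller fusion weight) can be illustrated with a minimal sketch. The grid discretization, the ErrorMap class, and the inverse-variance weight mapping below are illustrative assumptions, not the authors' implementation, which the paper does not detail here:

```python
class ErrorMap:
    """Hypothetical continuous error map: per-sensor error estimates (meters)
    stored on a 2D grid, as a sensor-rich AV might upload and an RSU broadcast."""

    def __init__(self, cell_size_m=10.0):
        self.cell_size = cell_size_m
        self.cells = {}  # (ix, iy, sensor) -> error estimate in meters

    def _key(self, x, y, sensor):
        return (int(x // self.cell_size), int(y // self.cell_size), sensor)

    def update(self, x, y, sensor, error_m):
        # Entries like this would be uploaded by the sensor-rich AV
        # (which has ground-truth positioning) and relayed by the RSU.
        self.cells[self._key(x, y, sensor)] = error_m

    def query(self, x, y, sensor, default_m=5.0):
        # Surrounding AVs query the map; fall back to a pessimistic default
        # where no error information has been collected yet.
        return self.cells.get(self._key(x, y, sensor), default_m)


def fusion_weight(error_m, floor_m=0.05):
    """Map a queried error to a fusion weight: larger error -> smaller weight.
    Inverse-variance weighting is one common choice; the paper only states
    that the weight decreases as the queried error grows."""
    return 1.0 / max(error_m, floor_m) ** 2


# Example: an AV queries the GNSS error at its position and derives a weight.
emap = ErrorMap()
emap.update(105.0, 42.0, "gnss", 6.5)  # uploaded earlier by the sensor-rich AV
w = fusion_weight(emap.query(105.0, 42.0, "gnss"))
```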
Date of Conference: 24-28 September 2023
Date Added to IEEE Xplore: 13 February 2024
Conference Location: Bilbao, Spain

I. Introduction

Recent advances in autonomous systems show great potential for smart mobility. Simultaneous localization and mapping (SLAM) is fundamental for most autonomous systems [1]. Three-dimensional (3D) light detection and ranging (LiDAR) provides dense 3D point clouds of the surroundings and is widely utilized to provide positioning and mapping solutions for autonomous systems [2]. However, the performance of LiDAR-based odometry can be affected by numerous dynamic objects [3] and structureless environments [4]. Visual odometry [5] is a popular technique that provides state estimation through feature matching, but its performance is sensitive to illumination conditions and the availability of features [6]. The global navigation satellite system (GNSS) provides absolute positioning services; unfortunately, its performance can be degraded by non-line-of-sight (NLOS) reception and multipath [7]. A single sensor can hardly meet the reliability requirements of navigation for AVs, so multi-sensor integration has received significant attention because of its complementarity and redundancy.
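To make the complementarity argument concrete, the sketch below fuses position estimates from heterogeneous sensors by inverse-variance weighting, so a GNSS fix degraded by NLOS contributes little when LiDAR odometry is healthy. The fuse_positions helper and all sensor values are hypothetical, chosen only to illustrate the effect:

```python
import numpy as np

def fuse_positions(estimates):
    """Inverse-variance fusion of independent 2D position estimates.

    estimates: list of (position, sigma_m) pairs; position is a length-2
    array, sigma_m the sensor's 1-sigma error in meters (e.g., as queried
    from an RSU-provided error map).
    """
    weights = np.array([1.0 / s ** 2 for _, s in estimates])
    positions = np.array([p for p, _ in estimates])
    fused = (weights[:, None] * positions).sum(axis=0) / weights.sum()
    fused_sigma = float(np.sqrt(1.0 / weights.sum()))
    return fused, fused_sigma

# Hypothetical urban-canyon case: GNSS degraded by NLOS, LiDAR odometry healthy.
gnss = (np.array([12.0, 4.5]), 8.0)    # large error -> near-zero weight
lidar = (np.array([10.2, 3.9]), 0.3)   # small error -> dominant weight
camera = (np.array([10.6, 4.1]), 1.5)
pos, sigma = fuse_positions([gnss, lidar, camera])
print(pos, sigma)  # fused estimate stays close to the LiDAR solution
```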

Fig. 1. Top: Illustration of the error map broadcast through the roadside unit (RSU). AV1 is equipped with a rich sensor suite (e.g., LiDAR, camera, GNSS, and high-end devices that can provide ground-truth positioning) to evaluate sensor errors periodically, while AV2 and AV3 are autonomous vehicles that receive the error map information to aid their navigation with their available sensors. Bottom: RGB images collected during the day and at night.
