
Dyn-DarkSLAM: YOLO-based Visual SLAM in Low-light Conditions


Abstract:
Although ORB-SLAM3 performs well on dynamic datasets once integrated with the YOLO algorithm, its strict lighting requirements limit its applicability to unmanned aerial vehicles (UAVs). To address this, we propose Dyn-DarkSLAM, a novel off-line visual SLAM system that significantly improves localization and tracking in low-light, dynamic conditions by combining ORB-SLAM3 with generative adversarial networks (GANs) and YOLOv5. Built upon ORB-SLAM3 and its strong visual localization capability, Dyn-DarkSLAM uses a GAN to preprocess low-light images and enhance their illumination, improving the system's robustness in weak lighting environments. In addition, YOLOv5 provides real-time object detection and semantic segmentation, further broadening ORB-SLAM3's application scope. Evaluations on publicly available datasets with added low-light noise and on publicly available 3D-modeled low-light datasets demonstrate that Dyn-DarkSLAM effectively reduces trajectory error while maintaining high computational efficiency, and that it sustains tracking longer than existing methods in extremely low-light conditions. These characteristics make Dyn-DarkSLAM well suited to applications such as autonomous drone patrols.
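The front-end pipeline the abstract describes can be sketched in miniature: brighten the incoming low-light frame, then discard feature points that fall inside detected dynamic objects before they reach the tracker. The sketch below is illustrative only, not the paper's implementation: a simple gamma curve stands in for the GAN enhancer, a hard-coded bounding box stands in for a YOLOv5 detection, and all function names are hypothetical.

```python
# Hypothetical sketch of a Dyn-DarkSLAM-style front end.
# Assumptions: a GAN enhancer is approximated by gamma correction, and a
# YOLOv5 detection is represented by a hand-written (x0, y0, x1, y1) box.

def enhance_low_light(pixels, gamma=0.4):
    """Stand-in for GAN-based enhancement: brighten 8-bit intensities
    with a gamma curve (gamma < 1 lifts dark values)."""
    return [round(255 * (p / 255) ** gamma) for p in pixels]

def mask_dynamic_features(keypoints, boxes):
    """Drop feature points that fall inside detected dynamic-object boxes,
    so only static scene points are handed to the SLAM tracker."""
    def inside(kp, box):
        (x, y), (x0, y0, x1, y1) = kp, box
        return x0 <= x <= x1 and y0 <= y <= y1
    return [kp for kp in keypoints if not any(inside(kp, b) for b in boxes)]

# Example: dark intensities are lifted, and the keypoint landing on the
# hypothetical "person" detection is filtered out before tracking.
dark_row = [10, 30, 60]
bright_row = enhance_low_light(dark_row)

keypoints = [(5, 5), (50, 50), (120, 40)]
person_box = (40, 30, 100, 90)  # hypothetical YOLOv5 detection
static_kps = mask_dynamic_features(keypoints, [person_box])
```

In the actual system these two stages would run on full images and ORB features, but the control flow (enhance first, then mask dynamic regions, then track) is the point being illustrated.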
Date of Conference: 21-23 July 2024
Date Added to IEEE Xplore: 01 October 2024
Conference Location: Tianjin, China

