
Real-Time Dynamic Visual-Inertial SLAM and Object Tracking Based on Lightweight Deep Feature Extraction Matching



Abstract:

To mitigate the heavy reliance on semantic information and the unreliability of handcrafted feature extraction in dynamic simultaneous localization and mapping (SLAM) and object tracking systems, a novel visual-inertial SLAM system with deep feature extraction and matching is proposed. A lightweight network is designed for feature extraction and description, replacing the oriented FAST and rotated BRIEF (ORB) approach; it addresses both the inference latency of deep learning and the shortcomings of handcrafted extraction. A fast object tracking method is developed that extracts and associates dynamic objects by clustering the distances of matched feature point pairs to their epipolar lines, enabling real-time tracking of numerous objects without semantic data. An online incremental loop closure detector for deep features is designed, which supersedes ORB-based detectors and supports global pose optimization. The system's effectiveness and advantages, including real-time execution on embedded platforms and improved self-localization and dynamic object tracking, are demonstrated through extensive evaluations. Notably, the system consistently tracks dynamic objects with rapid motion or weak texture, providing the localization system with robust dynamic constraints.
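
The abstract does not spell out the clustering step, so the following Python sketch illustrates one plausible reading of it: estimate the fundamental matrix from matched feature points, measure each match's distance to its epipolar line, and cluster those residuals so that the low-residual cluster is treated as static background and the remaining clusters as candidate dynamic objects. The use of OpenCV and DBSCAN, the function names, and the parameter values (eps, min_samples) are illustrative assumptions, not details taken from the paper.

    import numpy as np
    import cv2
    from sklearn.cluster import DBSCAN

    def epipolar_residuals(pts_prev, pts_curr):
        # Distance of each current-frame point to the epipolar line induced
        # by its matched previous-frame point (geometric residual).
        F, _ = cv2.findFundamentalMat(pts_prev, pts_curr,
                                      cv2.FM_RANSAC, 1.0, 0.999)
        F = F[:3]  # guard against stacked solutions
        # Epipolar lines in the current image for previous-image points
        lines = cv2.computeCorrespondEpilines(
            pts_prev.reshape(-1, 1, 2), 1, F).reshape(-1, 3)
        a, b, c = lines[:, 0], lines[:, 1], lines[:, 2]
        x, y = pts_curr[:, 0], pts_curr[:, 1]
        return np.abs(a * x + b * y + c) / np.sqrt(a ** 2 + b ** 2)

    def split_static_dynamic(pts_prev, pts_curr, eps=2.0, min_samples=8):
        # Cluster matches by epipolar residual: small residuals are consistent
        # with camera ego-motion (static scene); other clusters are candidate
        # dynamic objects. eps/min_samples are hypothetical tuning values.
        d = epipolar_residuals(pts_prev, pts_curr).reshape(-1, 1)
        labels = DBSCAN(eps=eps, min_samples=min_samples).fit_predict(d)
        means = {l: d[labels == l].mean() for l in set(labels) if l != -1}
        static_label = min(means, key=means.get)
        return labels == static_label, labels

In this reading, no semantic segmentation is needed: dynamic objects are separated purely by their geometric inconsistency with the dominant ego-motion, which matches the abstract's claim of tracking numerous objects without semantic data.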
Article Sequence Number: 5013922
Date of Publication: 27 February 2025



