
A Visual-Inertial Localization Method for Unmanned Aerial Vehicle in Underground Tunnel Dynamic Environments


Figure: An overview of our proposed localization method.


Abstract:

Unmanned Aerial Vehicles (UAVs) can significantly improve the autonomy of the mining industry, and self-localization is the key to autonomous flight of underground UAVs. This paper proposes a localization method based on visual-inertial sensor data fusion, which aims to improve the localization accuracy and robustness of underground UAVs in dynamic environments. First, an algorithm for dynamic point detection and rejection is presented, which combines a semantic segmentation neural network, an optical flow method, and an epipolar constraint method. Second, a visual-inertial sensor fusion algorithm is used to enhance performance in areas lacking static visual features; unlike monocular systems, it also provides absolute scale to the localization results. Finally, a hand-held multi-sensor data collection system with accurate calibration is developed to imitate flights of underground UAVs and ease data collection in real underground tunnels. We evaluate the proposed localization method and compare it with the state-of-the-art method VINS-Mono on both the public EuRoC dataset and our own data collected in underground tunnels. Experimental results show that the proposed visual-inertial localization method improves accuracy by more than 67% over VINS-Mono in highly dynamic environments, and that it can be applied to underground dynamic scenes with high robustness and accuracy.
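To make the dynamic-point rejection step concrete, the sketch below illustrates only its geometric part: features are tracked between frames with sparse optical flow, a fundamental matrix is estimated robustly, and tracked points that violate the epipolar constraint are flagged as dynamic. This is a minimal sketch assuming OpenCV, not the paper's implementation; the threshold, detector parameters, and the function name reject_dynamic_points are illustrative assumptions, and the semantic-segmentation and IMU components of the full pipeline are omitted.

```python
# Minimal sketch (not the authors' code) of epipolar-constraint-based
# dynamic point rejection: track features with Lucas-Kanade optical flow,
# fit a fundamental matrix with RANSAC, and flag points whose distance to
# their epipolar line exceeds a threshold as dynamic.
import cv2
import numpy as np

EPIPOLAR_DIST_THRESH = 1.0  # pixels; assumed value, tune per camera


def reject_dynamic_points(prev_gray, curr_gray):
    """Return (static_points, dynamic_points) between two grayscale frames."""
    # Detect corners in the previous frame (parameters are illustrative).
    prev_pts = cv2.goodFeaturesToTrack(prev_gray, maxCorners=500,
                                       qualityLevel=0.01, minDistance=7)
    if prev_pts is None:
        return np.empty((0, 2)), np.empty((0, 2))

    # Track the corners into the current frame with pyramidal LK flow.
    curr_pts, status, _ = cv2.calcOpticalFlowPyrLK(prev_gray, curr_gray,
                                                   prev_pts, None)
    ok = status.ravel() == 1
    p0 = prev_pts[ok].reshape(-1, 2)
    p1 = curr_pts[ok].reshape(-1, 2)
    if len(p0) < 8:
        return p1, np.empty((0, 2))

    # Robustly estimate the fundamental matrix; RANSAC discards gross
    # outliers, while the explicit epipolar test below catches points on
    # independently moving objects.
    F, _ = cv2.findFundamentalMat(p0, p1, cv2.FM_RANSAC, 1.0, 0.99)
    if F is None:
        return p1, np.empty((0, 2))

    # Epipolar lines in the current image induced by previous-frame points;
    # each row is (a, b, c) for the line a*x + b*y + c = 0.
    lines = cv2.computeCorrespondEpilines(p0.reshape(-1, 1, 2), 1, F)
    lines = lines.reshape(-1, 3)

    # Point-to-line distance |a*x + b*y + c| / sqrt(a^2 + b^2).
    num = np.abs(lines[:, 0] * p1[:, 0] + lines[:, 1] * p1[:, 1] + lines[:, 2])
    den = np.sqrt(lines[:, 0] ** 2 + lines[:, 1] ** 2)
    dist = num / np.maximum(den, 1e-9)

    is_static = dist <= EPIPOLAR_DIST_THRESH
    return p1[is_static], p1[~is_static]
```

In the full pipeline described in the abstract, points falling inside segmentation masks of movable objects would be rejected before this geometric test, and only the surviving static features would feed the visual-inertial estimator.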
Published in: IEEE Access (Volume 8)
Pages: 76809-76822
Date of Publication: 22 April 2020
Electronic ISSN: 2169-3536
