Real-Time Visual SLAM for Autonomous Underwater Hull Inspection Using Visual Saliency

Authors: Ayoung Kim (Dept. of Naval Architecture & Marine Engineering, Univ. of Michigan, Ann Arbor, MI, USA); R. M. Eustice

This paper reports a real-time monocular visual simultaneous localization and mapping (SLAM) algorithm and results for its application in the area of autonomous underwater ship hull inspection. The proposed algorithm overcomes some of the specific challenges associated with underwater visual SLAM, namely, limited field-of-view imagery and feature-poor regions. It does so by exploiting our SLAM navigation prior within the image registration pipeline and by being selective about which imagery is considered informative in terms of our visual SLAM map. Novel online bag-of-words measures of intra- and inter-image saliency are introduced and shown to be useful for image key-frame selection, information-gain-based link hypothesis, and novelty detection. Results from three real-world hull inspection experiments evaluate the overall approach, including one survey comprising a 3.4-h/2.7-km-long trajectory.
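The abstract's key-frame selection idea can be illustrated with a minimal sketch. The paper's exact saliency measures are not given here, so the formulation below is an assumption: intra-image saliency is scored as the normalized entropy of an image's bag-of-visual-words histogram, and only images above a threshold are kept as key-frames. Function names, the entropy formulation, and the threshold are all hypothetical.

```python
import math
from collections import Counter

def intra_image_saliency(word_ids, vocab_size):
    """Score an image's visual distinctiveness as the normalized entropy of
    its bag-of-visual-words histogram (hypothetical formulation; the paper's
    actual saliency measure may differ). Returns a value in [0, 1]:
    feature-poor imagery (few distinct words) scores near 0."""
    counts = Counter(word_ids)  # histogram of visual-word occurrences
    total = sum(counts.values())
    entropy = -sum((c / total) * math.log2(c / total) for c in counts.values())
    return entropy / math.log2(vocab_size)  # normalize by max possible entropy

def select_keyframes(scores, threshold=0.5):
    # Keep only images whose saliency clears the (hypothetical) threshold,
    # so feature-poor hull regions do not inflate the SLAM map.
    return [i for i, s in enumerate(scores) if s >= threshold]
```

For example, an image whose features spread uniformly over the vocabulary scores 1.0, while an image dominated by a single visual word scores 0.0 and would be discarded.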

Published in:

IEEE Transactions on Robotics (Volume 29, Issue 3)