
Dataset on underwater change detection



Abstract:

The detection of moving objects in a scene is a well-researched but, depending on the concrete setting, still often challenging computer vision task. It is usually the first step in a whole pipeline, and all subsequent algorithms (tracking, classification, etc.) depend on the accuracy of the detection. Hence, a good pixel-precise segmentation of the objects of interest is mandatory for many applications. However, the underwater environment has mostly been neglected so far, and no common dataset exists to evaluate different algorithms under harsh underwater conditions; a comprehensive evaluation is therefore impossible. In this paper, we present an underwater change detection dataset consisting of five videos and hundreds of hand-segmented ground-truth images, as well as a survey of different underwater image enhancement techniques and their impact on segmentation algorithms.
Date of Conference: 19-23 September 2016
Date Added to IEEE Xplore: 01 December 2016
Conference Location: Monterey, CA, USA

1. Introduction

The most popular approach to creating a pixelwise foreground-background segregation is change detection. This is because changes in a static scene (produced, e.g., by a mounted surveillance camera) correspond to moving objects, which are generally the interesting parts of the scene. Non-static scenes produced by a moving camera, as well as single images, are far more challenging to handle and usually require object-specific algorithms that have to be trained beforehand, for example neural networks or Haar-like features. Although these can be applied to any scene, the need for a learning phase for each object limits their usability.
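To make the change-detection idea concrete, the following is a minimal NumPy-only sketch of background subtraction with a running-average background model. It is illustrative only, not the method evaluated in the paper; the function name and the `alpha`/`threshold` parameters are assumptions chosen for the example.

```python
import numpy as np

def detect_changes(frames, alpha=0.05, threshold=30):
    """Running-average background subtraction (illustrative sketch).

    frames: iterable of 2-D uint8 grayscale arrays from a static camera.
    Returns a list of boolean foreground masks (True = changed pixel).
    """
    background = None
    masks = []
    for frame in frames:
        f = frame.astype(np.float32)
        if background is None:
            # Initialize the background model with the first frame.
            background = f.copy()
        # Pixels differing strongly from the background are foreground.
        masks.append(np.abs(f - background) > threshold)
        # Slowly adapt the background via an exponential moving average.
        background = (1 - alpha) * background + alpha * f
    return masks

# Hypothetical usage: a static 8x8 scene with a bright object
# appearing in the third frame.
static = np.full((8, 8), 50, dtype=np.uint8)
moving = static.copy()
moving[2:5, 2:5] = 200  # simulated moving object
masks = detect_changes([static, static, moving])
```

In this toy sequence the first two masks contain no foreground, while the third flags exactly the region where the simulated object appeared. Real pipelines use more robust background models (e.g. per-pixel mixtures of Gaussians), but the segregation principle is the same.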

