
Onet: Twin U-Net Architecture for Unsupervised Binary Semantic Segmentation in Radar and Remote Sensing Images


Abstract:

Segmenting objects from cluttered backgrounds in single-channel images, such as marine radar echoes, medical images, and remote sensing images, poses significant challenges due to limited texture and color information and diverse target types. This paper proposes a novel solution: the Onet, an O-shaped assembly of twin U-Net deep neural networks designed for unsupervised binary semantic segmentation. The Onet, trained on intensity-complementary image pairs without annotated labels, maximizes the Jensen-Shannon divergence (JSD) between the densely localized features and the class probability maps. By leveraging the symmetry of U-Net, Onet subtly strengthens the dependence between dense local features, global features, and class probability maps during training. The design of the complementary input pair aligns with the theoretical requirement that, when optimizing the JSD, the class probabilities of negative samples must accurately estimate the marginal distribution. Compared to current leading unsupervised segmentation methods, Onet demonstrates superior performance on target segmentation in marine radar frames and cloud segmentation in remote sensing images. Notably, we found that Onet's foreground prediction significantly enhances the signal-to-noise ratio (SNR) of targets amidst marine radar clutter. Onet's source code is publicly accessible at https://github.com/joeyee/Onet.
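
The following PyTorch-style sketch illustrates one way the objective described in the abstract could be set up: an image and its intensity complement are passed through the network, and a JSD-based mutual-information lower bound between dense features and class probability maps is maximized. The `model` and `critic` interfaces and the pairing scheme are illustrative assumptions only, not the authors' implementation (see the linked repository for that).

```python
# Hedged sketch of the training idea summarized in the abstract.
# `model(x)` is assumed to return (dense_features, class_prob_map);
# `critic(f, p)` is assumed to score a (features, class map) pair.
import torch
import torch.nn.functional as F

def jsd_mi_lower_bound(pos_scores: torch.Tensor, neg_scores: torch.Tensor) -> torch.Tensor:
    """JSD-based mutual-information lower bound (Deep InfoMax style):
    E_P[-softplus(-T)] - E_N[softplus(T)]."""
    return -F.softplus(-pos_scores).mean() - F.softplus(neg_scores).mean()

def training_step(model, critic, x: torch.Tensor) -> torch.Tensor:
    """One unsupervised step on a single-channel batch x scaled to [0, 1]."""
    x_comp = 1.0 - x                      # intensity-complementary input
    feat_a, prob_a = model(x)
    feat_b, prob_b = model(x_comp)

    # Matched pairs approximate samples from the joint distribution.
    pos = critic(feat_a, prob_a) + critic(feat_b, prob_b)
    # Cross-matched pairs act as negatives drawn from the product of marginals,
    # which is where the complementary input pair plays its theoretical role.
    neg = critic(feat_a, prob_b) + critic(feat_b, prob_a)

    return -jsd_mi_lower_bound(pos, neg)  # minimize the negative bound
```
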
Published in: IEEE Transactions on Image Processing (Early Access)
Page(s): 1 - 1
Date of Publication: 23 January 2025
