
CLEAR-Net: A Cascaded Local and External Attention Network for Enhanced Dehazing of Remote Sensing Images


Abstract:

Atmospheric haze degrades remote sensing images by obscuring critical details, diminishing data interpretation accuracy. This paper presents CLEAR-Net: Cascaded Local and External Attention Resource Network, a novel dehazing framework designed to enhance clarity and detail in remote sensing images. CLEAR-Net employs a two-block cascaded architecture to address both global and local haze effects. The first block utilizes an Outer Global Attention (OGA) mechanism, processing global features via a U-Net Inner Attention (UIA) module to mitigate widespread haze. The second block focuses on Outer Local Attention (OLA), refining local image features through reintegration with the UIA module. An ensemble method selects the most effective model weights based on perceptual quality metrics, and a composite loss function emphasizes the preservation of high-frequency details to maintain textural and edge fidelity. CLEAR-Net's efficacy is evaluated on the SateHaze1k and RICE datasets, outperforming existing state-of-the-art dehazing techniques. On SateHaze1k, CLEAR-Net achieves an average PSNR of 27.876 dB, surpassing the next-best model by approximately 4.81%; under moderate fog, it attains a PSNR of 30.824 dB, about 9.8% higher than existing methods. On the RICE dataset, which includes challenging cloud cover, CLEAR-Net achieves PSNRs of 36.873 dB on RICE1 and 34.624 dB on RICE2, improving over next-best models by approximately 18.2% and 12%, respectively, and attains the lowest LPIPS values, indicating superior perceptual quality.
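
The abstract describes a two-block cascade: a global-attention block that suppresses widespread haze, followed by a local-attention block that refines detail, with both blocks routing features through a shared U-Net Inner Attention (UIA) module. The sketch below is only an illustration of that cascaded structure under assumed module internals; the class names mirror the abstract's OGA/UIA/OLA terminology, but the layer choices, channel widths, and residual wiring are hypothetical and are not the authors' implementation.

```python
import torch
import torch.nn as nn

class UNetInnerAttention(nn.Module):
    """Placeholder for the shared UIA refinement module (internals assumed)."""
    def __init__(self, channels: int):
        super().__init__()
        self.body = nn.Sequential(
            nn.Conv2d(channels, channels, 3, padding=1),
            nn.ReLU(inplace=True),
            nn.Conv2d(channels, channels, 3, padding=1),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Residual refinement of the incoming feature map.
        return x + self.body(x)


class CascadedDehazer(nn.Module):
    """Hypothetical two-block cascade: global block (OGA + UIA) then local block (OLA + UIA)."""
    def __init__(self, channels: int = 32):
        super().__init__()
        self.stem = nn.Conv2d(3, channels, 3, padding=1)
        self.global_block = UNetInnerAttention(channels)  # stands in for OGA feeding UIA
        self.local_block = UNetInnerAttention(channels)   # stands in for OLA reintegrated with UIA
        self.head = nn.Conv2d(channels, 3, 3, padding=1)

    def forward(self, hazy: torch.Tensor) -> torch.Tensor:
        feats = self.stem(hazy)
        feats = self.global_block(feats)   # first block: mitigate widespread (global) haze
        feats = self.local_block(feats)    # second block: refine local image features
        # Predict a residual correction on top of the hazy input.
        return torch.clamp(hazy + self.head(feats), 0.0, 1.0)


if __name__ == "__main__":
    model = CascadedDehazer()
    restored = model(torch.rand(1, 3, 256, 256))
    print(restored.shape)  # torch.Size([1, 3, 256, 256])
```

The ensemble weight selection and the composite, high-frequency-aware loss mentioned in the abstract are not represented here; they would sit in the training loop rather than in the forward pass.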
Page(s): 1 - 20
Date of Publication: 27 January 2025