MPT-SFANet: Multiorder Pooling Transformer-Based Semantic Feature Aggregation Network for SAR Image Classification


Abstract:

Transformer-based methods have demonstrated remarkable advances in synthetic aperture radar (SAR) image classification. Nevertheless, many of these methods ignore the global statistical information and semantic feature interactions needed to effectively characterize different SAR land covers with complex structures. Second-order statistics offer an effective way to characterize the statistical properties of SAR patches. Motivated by this, we integrate pyramid pooling and global covariance pooling into each multihead self-attention block, yielding a multiorder pooling transformer module that extracts powerful contextual features and captures the global statistical nature of SAR patches. In parallel, a semantic feature aggregation module captures local deep features and models the interaction of feature information across different feature levels. Both modules are embedded in a U-shaped architecture, which we call the multiorder pooling transformer-based semantic feature aggregation network (MPT-SFANet). Extensive experiments on the TerraSAR, Sentinel-1B, and GF-3 SAR image classification datasets show that MPT-SFANet outperforms several relevant methods.
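The two pooling operations named in the abstract can be sketched in isolation. The following is a minimal NumPy illustration of what "global covariance pooling" (second-order statistics of channel responses) and "pyramid pooling" (multi-scale average pooling) compute on a single feature map; the function names, bin sizes, and array layout are illustrative assumptions, not the authors' implementation, which embeds these operations inside multihead self-attention blocks.

```python
import numpy as np

def global_covariance_pooling(feat):
    """Global covariance (second-order) pooling of a feature map.

    feat: array of shape (C, H, W). Returns the C x C covariance matrix
    of channel responses over all spatial positions, i.e. the global
    second-order statistics referred to in the abstract.
    """
    C, H, W = feat.shape
    X = feat.reshape(C, H * W)             # channels x spatial positions
    X = X - X.mean(axis=1, keepdims=True)  # centre each channel
    return (X @ X.T) / (H * W - 1)         # C x C covariance matrix

def pyramid_pooling(feat, bins=(1, 2, 4)):
    """Spatial pyramid pooling: average-pool the map on several grids
    and concatenate the flattened results as multi-scale context.
    The bin sizes (1, 2, 4) are an illustrative choice."""
    C, H, W = feat.shape
    out = []
    for b in bins:
        pooled = np.zeros((C, b, b))
        for i in range(b):
            for j in range(b):
                hs, he = i * H // b, (i + 1) * H // b
                ws, we = j * W // b, (j + 1) * W // b
                pooled[:, i, j] = feat[:, hs:he, ws:we].mean(axis=(1, 2))
        out.append(pooled.reshape(C, -1))
    return np.concatenate(out, axis=1)     # C x (1 + 4 + 16) for these bins
```

For an 8-channel 16x16 map, `global_covariance_pooling` yields a symmetric 8x8 matrix and `pyramid_pooling` an 8x21 multi-scale descriptor; in the paper these statistics feed the attention computation rather than standing alone.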
Published in: IEEE Transactions on Aerospace and Electronic Systems ( Volume: 60, Issue: 4, August 2024)
Page(s): 4923 - 4938
Date of Publication: 29 March 2024

