
DAMNet: Dual Attention Mechanism Deep Neural Network for Underwater Biological Image Classification


The framework consists of three main components: the incorporation of dual attention, the stacking of Transformer blocks, and the multi-stage distribution that ultimately...


Abstract:

The complex backgrounds and biodiversity of underwater biological images make the identification of marine organisms difficult. To address these problems, we propose a dual attention mechanism deep neural network for underwater biological image classification (DAMNet). First, the proposed DAMNet uses multi-stage stacking to suppress the complex underwater background; this stacking also reduces the number of model parameters and improves generalization ability. Second, a dual attention mechanism module is combined with an improved inverted residual bottleneck based on depthwise convolution to extract feature information from underwater biological images along both the spatial and channel dimensions, yielding better discrimination and feature extraction capability. Finally, the Gravity optimizer is selected to update the model weights, and exponential translation improves the model's convergence speed and learning rate. Extensive experiments on a dataset consisting of seven types of underwater biological images demonstrate that the DAMNet model has higher learning ability and robustness than state-of-the-art methods. Our DAMNet model achieves 96.93% classification accuracy across all categories, at least a 2-percentage-point improvement over the other models compared.
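The abstract does not give the internals of the dual attention module. As a rough illustration of the general idea of attending over both the channel and spatial dimensions of a feature map, the sketch below gates a feature map with a sigmoid of per-channel and per-pixel descriptors; the learnable projections, bottleneck layers, and exact gating of the actual DAMNet module are omitted, and all function names here are hypothetical.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def channel_attention(x):
    # x: feature map of shape (C, H, W).
    # Global average pooling over spatial dims gives one descriptor per channel,
    # which is squashed to (0, 1) and used to rescale each channel.
    desc = x.mean(axis=(1, 2))              # (C,)
    return x * sigmoid(desc)[:, None, None]

def spatial_attention(x):
    # Averaging over channels gives one descriptor per spatial location,
    # which rescales every channel at that pixel.
    desc = x.mean(axis=0)                   # (H, W)
    return x * sigmoid(desc)[None, :, :]

def dual_attention(x):
    # Apply channel attention, then spatial attention, in sequence.
    return spatial_attention(channel_attention(x))

x = np.random.randn(8, 4, 4)
y = dual_attention(x)
print(y.shape)  # the gating preserves the feature-map shape
```

Because both gates lie in (0, 1), the output is an elementwise attenuation of the input: salient channels and locations (large descriptors) are kept nearly intact while others are suppressed.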
Published in: IEEE Access ( Volume: 11)
Page(s): 6000 - 6009
Date of Publication: 06 December 2022
Electronic ISSN: 2169-3536
