Abstract:
Ship detection based on wide-area remote sensing imagery has a wide range of applications in areas such as ship supervision and rescue at sea. However, wide-area remote sensing satellites sacrifice spatial and spectral resolution to cover a larger sea area, which leads to smaller ship scales, fewer pixels per ship, and a lack of texture detail in the images. Meanwhile, the sparse distribution of ships at sea causes an imbalance between target-class and background-class samples. In this paper, we propose a deep learning network, LKPF-YOLO, for detecting small-target ships in wide-area remote sensing images. To this end, we first create a South China Sea wide-area remote sensing dataset containing about 7600 ship instances. The dataset employs a grayscale transformation to suppress color ranges unrelated to ships, reducing interference from clouds and clutter. To extract features of small objects and low-contrast targets more efficiently, we design a re-parameterized large-kernel module, C2Rep, which gives the network a larger effective receptive field and richer gradient flow information. In addition, we use depthwise separable convolution and the C3 feature extraction module to reduce the high complexity caused by large convolution kernels. Finally, we design a loss function, Priori Focal Loss, based on imbalanced learning and prior knowledge, which dynamically adjusts the loss weights according to the confidence scores and the ground-truth box area, thereby guiding the model to focus more on small and difficult samples during training. The experimental results show that the model achieves accurate and stable small-target ship detection on wide-area remote sensing datasets and generalizes well to complex scenes and multi-category detection. The mAP_{50} (mean Average Precision at IoU=0.50) and mAP_{50:95} of the model reached 93.6% and 50.7%, which were 5.5% and 12.9% higher than the original model, respectively. In addition, the num...
Published in: IEEE Transactions on Aerospace and Electronic Systems (Early Access)
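
To make the described weighting scheme concrete, the following is a minimal sketch of a focal-style loss that scales the per-sample weight by both prediction confidence and ground-truth box area, as the abstract describes for Priori Focal Loss. The exact formulation is not given in the abstract, so the specific weighting form (a standard focal term multiplied by an area-based prior) and all function and parameter names here are illustrative assumptions, not the authors' implementation.

    # Illustrative sketch only: the exact Priori Focal Loss formulation is not stated
    # in the abstract; the area-prior weighting below is an assumed form.
    import torch
    import torch.nn.functional as F

    def priori_focal_loss_sketch(pred_logits, targets, gt_areas, image_area,
                                 gamma=2.0, alpha=0.25):
        """Binary focal-style loss whose weight grows for low-confidence predictions
        and for small ground-truth boxes.

        pred_logits: (N,) raw classification logits
        targets:     (N,) float 0/1 labels
        gt_areas:    (N,) matched ground-truth box areas in pixels (0 for negatives)
        image_area:  scalar, area of the input image in pixels
        """
        p = torch.sigmoid(pred_logits)
        ce = F.binary_cross_entropy_with_logits(pred_logits, targets, reduction="none")

        # Standard focal modulation: down-weight easy (high-confidence) samples.
        p_t = p * targets + (1 - p) * (1 - targets)
        focal_weight = alpha * (1 - p_t) ** gamma

        # Assumed area prior: smaller ground-truth boxes receive larger weights;
        # negative samples keep a weight of 1.
        area_ratio = (gt_areas / image_area).clamp(min=1e-6)
        size_prior = torch.where(targets > 0, 1.0 - area_ratio,
                                 torch.ones_like(area_ratio))

        return (focal_weight * size_prior * ce).mean()

Under these assumptions, a small, low-confidence ship box contributes a large per-sample weight, while easy background samples are down-weighted, which matches the abstract's stated goal of focusing training on small and difficult samples.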