DIOD: Fast, Semi-Supervised Deep ISAR Object Detection


Abstract:

Inverse synthetic aperture radar (ISAR) object detection is one of the most challenging problems in computer vision, and most existing ISAR object detection algorithms are complicated and perform poorly. To provide a convenient, high-quality ISAR object detection method, we propose a fast semi-supervised method, called DIOD, based on fully convolutional region candidate networks (FCRCNs) and deep convolutional neural networks. First, most state-of-the-art detection methods rely on region candidates to localize potential objects, but this step is often the main computational bottleneck. To perform localization robustly and accurately in minimal time, we propose an FCRCN with "seed" boxes at multiple scales and aspect ratios, which offers almost cost-free candidate computation and achieves excellent performance. Second, to overcome the lack of labeled training data, the model undergoes an efficient semi-supervised pretraining process followed by fine-tuning. Finally, to further improve the accuracy and speed of the detection system, we introduce a novel sharing mechanism and a joint learning strategy that extract more discriminative and comprehensive features while simultaneously learning the latent shared and individual features and their correlations. Extensive experiments on two real-world ISAR datasets show that DIOD outperforms existing state-of-the-art methods.
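
The FCRCN's "seed" boxes at multiple scales and aspect ratios are reminiscent of anchor-based region proposal, where a fixed grid of candidate boxes is tiled over the image so that candidate generation is essentially free. The sketch below shows one plausible way such a seed grid could be constructed; the function name, stride, scales, and ratios are illustrative assumptions, not values taken from the paper.

```python
import numpy as np

def generate_seed_boxes(feature_h, feature_w, stride=16,
                        scales=(64, 128, 256), ratios=(0.5, 1.0, 2.0)):
    """Tile seed boxes over a feature map, one set per spatial cell.

    Hypothetical sketch: stride/scales/ratios are assumed defaults.
    Returns an (feature_h * feature_w * len(scales) * len(ratios), 4)
    array of boxes in (x1, y1, x2, y2) image coordinates.
    """
    boxes = []
    for y in range(feature_h):
        for x in range(feature_w):
            # Center of this feature-map cell in image coordinates.
            cx = (x + 0.5) * stride
            cy = (y + 0.5) * stride
            for s in scales:
                for r in ratios:
                    # Width/height pair with area s^2 and aspect ratio r.
                    w = s * np.sqrt(r)
                    h = s / np.sqrt(r)
                    boxes.append([cx - w / 2, cy - h / 2,
                                  cx + w / 2, cy + h / 2])
    return np.asarray(boxes, dtype=np.float32)

# Example: a 32x32 feature map with stride 16 covers a 512x512 image.
seeds = generate_seed_boxes(32, 32)
print(seeds.shape)  # (9216, 4): 32*32 cells x 3 scales x 3 ratios
```

Because the grid is fixed, the network only needs to classify and regress these precomputed boxes rather than search for candidate regions, which is consistent with the "almost cost-free candidate computation" the abstract describes.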
Published in: IEEE Sensors Journal (Volume: 19, Issue: 3, 01 February 2019)
Page(s): 1073-1081
Date of Publication: 06 November 2018
