This paper introduces two methods that use a coherence-analysis framework to generate synthetic aperture sonar (SAS)-like images displaying acoustic color (AC) information useful for classifying buried and/or proud underwater objects. The first method applies to sonar backscatter collected with multiple hydrophones: two channels are formed from the data of two hydrophone subarrays at each frequency and over several pings. The second method is intended for applications in which sonar backscatter is collected with a single hydrophone: two channels are formed from the data of two synthetic subarrays at each frequency. In both cases, the resulting SAS-like AC images display information in a ping-frequency plane and hence convey information useful for the detection, localization, and classification of underwater objects based on properties not typically conveyed by conventional SAS images. The single-hydrophone version of SAS-like AC processing has the added benefit of generating 3-D images that also display range information for enhanced localization. Furthermore, this coherence-based SAS-like method does not require the elaborate platform motion estimation and compensation used in conventional SAS. The effectiveness of both methods is demonstrated on two real sonar databases, both by comparing the generated SAS-like AC images to images produced with conventional methods and by applying a simple but effective classification framework directly to the AC features in the SAS-like images.
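As a rough illustration of the kind of two-channel coherence map the abstract alludes to, the sketch below estimates a magnitude-squared coherence image in the ping-frequency plane from two channels of per-ping frequency-domain data. Everything here is an assumption for illustration only: the function name, the sliding-window ping averaging, and the regularization term are not taken from the paper, whose actual estimator may differ substantially.

```python
import numpy as np

def coherence_image(ch1, ch2, n_avg=5):
    """Magnitude-squared coherence between two channels in the
    ping-frequency plane (illustrative sketch, not the paper's method).

    ch1, ch2 : complex arrays of shape (n_pings, n_freqs) holding the
               per-ping frequency-domain data of two (sub)array channels.
    n_avg    : number of consecutive pings averaged per estimate
               (an assumed choice, not from the paper).
    """
    # Per-ping cross- and auto-spectra at each frequency.
    s12 = ch1 * np.conj(ch2)
    s11 = np.abs(ch1) ** 2
    s22 = np.abs(ch2) ** 2

    # Moving average over n_avg consecutive pings (axis 0).
    kern = np.ones(n_avg) / n_avg
    def smooth(s):
        return np.apply_along_axis(
            lambda col: np.convolve(col, kern, mode="same"), 0, s)

    s12m, s11m, s22m = smooth(s12), smooth(s11), smooth(s22)
    # Magnitude-squared coherence in [0, 1]; small term avoids 0/0.
    return np.abs(s12m) ** 2 / (s11m * s22m + 1e-12)
```

Identical channels yield coherence near 1 everywhere, while statistically independent channels average out toward roughly 1/n_avg, which is what makes such a map useful as an image: coherent scatterers stand out against incoherent background reverberation.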