Big brown bats emit brief frequency-modulated (FM) biosonar signals in the 20-100-kHz frequency band. They use a unique delay-based processing method to focus target images and defocus clutter. Broadcasts and echoes are received by the ears and encoded as spectrogram-like representations by parallel auditory band-pass filters. Neural responses to successive frequencies in broadcasts travel along parallel neuronal delay lines to serve as a template for comparison with similar responses to incoming echoes. Coincidence comparisons between responses to broadcasts and echoes form images of echo delay. Interference between multiple reflections from the target introduces nulls at selected frequencies and activates a second, cepstrum-like process for estimating the time separation of the reflections. For lowpass echoes from clutter, the global weakening of higher frequencies defocuses the images of clutter and allows focused target images to emerge from the background. Temporal misalignment of neuronal responses to low-amplitude frequencies leads to a novel computational solution to the clutter problem in sonar. The wider biomimetic lesson is to implement fully parallel computations using only 1-bit timing comparisons, rather than emulating these computations in conventional digital signal processing with cumbersome digital arithmetic on multibit words.
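The cepstrum-like process described above can be illustrated with a minimal numerical sketch. Two overlapping reflections separated by a delay dt interfere to carve nulls into the echo's magnitude spectrum at a spacing of 1/dt; taking the inverse FFT of the log magnitude spectrum (the real cepstrum) converts that periodic ripple into a peak at quefrency dt. The sample rate, glint separation, and reflection amplitudes below are hypothetical, and an idealized broadband impulse stands in for the bat's FM sweep to keep the example self-contained; this is an illustration of the signal-processing principle, not the authors' neural model.

```python
import numpy as np

fs = 500_000             # sample rate in Hz (hypothetical)
dt_true = 40e-6          # assumed glint separation: 40 microseconds
n = 1024                 # number of samples in the simulated echo

# Idealized two-glint echo: a direct reflection plus a weaker second
# reflection delayed by dt_true (impulses stand in for the FM sweep).
echo = np.zeros(n)
echo[0] = 1.0
echo[int(round(dt_true * fs))] = 0.8

# Interference between the two reflections puts nulls in the magnitude
# spectrum at a spacing of 1/dt_true; the real cepstrum (inverse FFT of
# the log magnitude spectrum) turns that ripple into a peak at dt_true.
log_mag = np.log(np.abs(np.fft.rfft(echo)))
cepstrum = np.abs(np.fft.irfft(log_mag, n))
peak = 2 + np.argmax(cepstrum[2 : n // 2])   # skip the zero-quefrency bin
dt_est = peak / fs

print(f"estimated separation: {dt_est * 1e6:.1f} us")  # → 40.0 us
```

Because the second reflection is weaker than the first (0.8 versus 1.0), the spectral nulls are finite and the log is well defined without regularization; with equal-amplitude reflections a small floor would be needed before taking the log.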