An important characteristic of coherent integrators is that their effective bandwidth decreases as the integration time increases. If it is known only that a weak signal occurs somewhere in a given frequency range, then the number of integration channels required to cover that range increases as the amount of coherent integration is increased. However, each integration channel can independently cause a false alarm, whereas only the particular channel in which the signal appears can cause a true alarm. The question therefore arises whether it is profitable to lengthen the coherent integration period to increase the signal-to-noise ratio when doing so requires an increase in the number of integration channels. This problem is investigated analytically. Numerical results suitable for system design are presented as a series of graphs of missed-signal probability vs. the number of integration channels, with initial signal-to-noise ratio and overall false-alarm probability as parameters. Also included is a detailed analysis of the statistical properties of ideal and approximately ideal coherent integrators.
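The trade-off described above can be sketched numerically. The following is a minimal illustration, not the paper's analysis: it assumes a nonfluctuating signal in Gaussian noise with square-law envelope detection, assumes coherent integration over n channels multiplies the per-sample signal-to-noise ratio by n, and splits the overall false-alarm budget evenly across channels. All parameter values (0 dB per-sample SNR, 10^-3 overall false-alarm probability) are illustrative and do not come from the paper.

```python
import math

def miss_probability(n_channels, snr0, pfa_total):
    """Missed-signal probability for the channel containing the signal.

    Idealized model (hypothetical, not the paper's exact statistics):
    coherent integration over n_channels samples scales the per-sample
    SNR snr0 by n_channels, while the overall false-alarm budget
    pfa_total is divided evenly among channels, so the per-channel
    detection threshold rises only logarithmically with n_channels.
    """
    p_channel = pfa_total / n_channels      # per-channel false-alarm rate
    t = -2.0 * math.log(p_channel)          # threshold: P(chi2 with 2 dof > t) = p_channel
    nc = 2.0 * n_channels * snr0            # noncentrality after coherent gain
    # Detection probability P(X > t) for a noncentral chi-square variate
    # with 2 dof, computed as a Poisson mixture of central chi-square tails.
    total, pois = 0.0, math.exp(-nc / 2.0)
    for k in range(300):
        # P(chi2 with 2(k+1) dof > t) = exp(-t/2) * sum_{j<=k} (t/2)^j / j!
        term, partial = 1.0, 1.0
        for j in range(1, k + 1):
            term *= (t / 2.0) / j
            partial += term
        total += pois * math.exp(-t / 2.0) * partial
        pois *= (nc / 2.0) / (k + 1)
    return 1.0 - total

# Illustrative sweep: 0 dB per-sample SNR, overall false-alarm budget 1e-3.
for n in (1, 2, 4, 8, 16):
    print(n, miss_probability(n, snr0=1.0, pfa_total=1e-3))
```

In this simplified model the linear coherent-integration gain outpaces the logarithmic threshold penalty from adding channels, so the missed-signal probability falls as integration is lengthened; the paper's graphs present the same trade-off under its exact statistical treatment.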