Cross correlating two received waveforms is an accepted method of estimating the differential time delay (DTD) of a common signal in two waveforms, as well as of detecting the signal. The DTD is rarely constant over the correlation integration time, although it can be treated as constant if its variation is small enough relative to the integration time and the signal bandwidth. This paper considers the effects of processing the data under the constant-DTD assumption when the DTD actually has a slow linear time variation [i.e., the signal exhibits linear relative time companding (RTC)], for the case of low-pass broad-band signal and noise. The effects are quantified in terms of two performance measures: output signal-to-noise ratio (SNR) and accuracy of DTD estimation. Previous results on output SNR are briefly reviewed and extended, and the effect of uncompensated RTC on DTD estimation accuracy is derived and confirmed with computer simulation results. The two effects are compared numerically for the special case of low-pass white signal and noise, and it is shown that, for this case, DTD estimation accuracy degrades more rapidly than does output SNR as the amount of uncompensated RTC increases. The results presented are also applicable to the effects of residual RTC mismatch when an RTC-compensating correlator is used but imperfect compensation is achieved.
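As an illustrative sketch (not taken from the paper), the setup can be simulated numerically: a low-pass white signal received on two channels with a constant DTD, estimated from the peak lag of the cross-correlation, followed by the same correlator applied to a channel with a slow linear time companding. All parameter values (integration length, delay, companding rate, noise level) are arbitrary assumptions chosen for the demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)

n = 4096       # correlation integration length, in samples
d_true = 25    # true DTD, in samples
s = rng.standard_normal(n + 2 * d_true)   # common broadband (white) signal

# Two received waveforms: x2 lags x1 by d_true samples; independent noise.
x1 = s[d_true:d_true + n] + 0.3 * rng.standard_normal(n)
x2 = s[:n] + 0.3 * rng.standard_normal(n)

# Constant-DTD case: the lag of the cross-correlation peak is the estimate.
lags = np.arange(-(n - 1), n)
xc = np.correlate(x2, x1, mode="full")   # peaks near lag = +d_true
d_hat = lags[np.argmax(xc)]

# Linear-RTC case: the second channel is a slowly companded copy,
# x2c[m] ~ s((1 + eps) m), so the DTD drifts by eps * n samples over the
# integration time while the correlator still assumes it is constant.
eps = 1e-3
m = np.arange(n)
x2c = np.interp((1 + eps) * m, np.arange(len(s)), s) \
      + 0.3 * rng.standard_normal(n)
xc_rtc = np.correlate(x2c, x1, mode="full")

# The companded channel yields a visibly smaller correlation peak than
# the constant-DTD channel: uncompensated RTC smears the peak.
print(d_hat, xc.max(), xc_rtc.max())
```

With these (assumed) values the total delay drift over the integration time is about four samples, which for a white signal sampled near its Nyquist rate is enough to substantially reduce the correlation peak, consistent with the output-SNR degradation discussed in the paper.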