A methodology is presented for analyzing the error associated with time delay estimation using a finite integration time correlator that processes waveforms received at two separate sensors. The signal considered is a sinusoid whose amplitude is randomly modulated, and the received waveforms are assumed to be embedded in additive Gaussian noise. Before they are correlated, the waveforms are converted to a lower center frequency by mixers whose local oscillators are assumed to contain phase noise. By direct calculation in the time domain, the variance of the error in the time delay estimate is shown to be a function of integration time, signal-to-noise ratios, signal and noise bandwidths, and phase noise variance. The phase noise is shown to limit the accuracy of the time delay estimate; in its absence, however, the accuracy is shown to approach that obtained by the maximum likelihood estimator.
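
As a rough illustration of the signal model and estimator described above, the following Python sketch simulates an amplitude-modulated sinusoid received at two sensors in additive Gaussian noise, downconverts each channel with a mixer whose local oscillator carries phase noise, and estimates the delay from the peak of a finite integration time cross-correlation. All parameters (fs, T, fc, f_lo, true_delay, snr_db, pn_step) are hypothetical and not taken from the paper; the LO phase noise is modeled as a simple Wiener (random-walk) process, and a complex-envelope correlator (peak of the correlation magnitude) stands in for the paper's correlator, since the source does not specify an implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

# --- Hypothetical parameters, chosen only for illustration ---
fs = 1.0e6           # sample rate (Hz)
T = 5e-3             # finite integration time (s)
fc = 100e3           # carrier frequency of the sinusoid (Hz)
f_lo = 90e3          # local-oscillator frequency: mix down to a 10 kHz IF
true_delay = 25e-6   # true inter-sensor time delay (s)
snr_db = 10.0        # per-sensor signal-to-noise ratio (dB)
pn_step = 2e-3       # rms LO phase increment per sample (rad), Wiener model

n = int(T * fs)
t = np.arange(n) / fs
d = int(round(true_delay * fs))

# Sinusoid (complex representation) whose amplitude is randomly
# modulated by low-pass-filtered Gaussian noise
am = 1.0 + np.convolve(rng.standard_normal(n), np.ones(200) / 200, "same")
z = am * np.exp(2j * np.pi * fc * t)

# Received waveforms: sensor 2 sees a delayed replica (integer-sample
# delay; np.roll wrap-around is negligible for d << n), each embedded
# in additive complex Gaussian noise at the chosen SNR
noise_std = np.sqrt(np.mean(np.abs(z) ** 2) / 10 ** (snr_db / 10) / 2)
x1 = z + noise_std * (rng.standard_normal(n) + 1j * rng.standard_normal(n))
x2 = np.roll(z, d) + noise_std * (rng.standard_normal(n) + 1j * rng.standard_normal(n))

def downconvert(x):
    """Mix to the IF with a local oscillator carrying Wiener phase noise."""
    phi = np.cumsum(pn_step * rng.standard_normal(n))  # random-walk LO phase
    return x * np.exp(-1j * (2 * np.pi * f_lo * t + phi))

y1, y2 = downconvert(x1), downconvert(x2)

# Finite-integration-time cross-correlator: the lag that maximizes the
# correlation magnitude is the time delay estimate (np.correlate
# conjugates its second argument)
lags = np.arange(-n + 1, n)
r = np.correlate(y2, y1, mode="full")
tau_hat = lags[np.argmax(np.abs(r))] / fs
print(f"true delay {true_delay * 1e6:.1f} us, estimate {tau_hat * 1e6:.1f} us")
```

Increasing pn_step (or the integration time over which the two LO phases drift apart) decoheres the product y2(t + tau) * conj(y1(t)) and flattens the correlation peak, which is one way to see qualitatively why the phase noise limits the attainable accuracy, as the abstract states; with pn_step set to zero the sketch reduces to the noise-only case.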