A method is presented for estimating the amplitude and time-delay parameters of signals that can be represented as a sum of scaled and delayed replicas of a known signal in the presence of nonwhite Gaussian noise. The method is based on the principles of maximum likelihood (ML) estimation, but at certain key points simplifying assumptions are made that yield a computationally practical estimation scheme. The results are compared with related results in the literature, and simulation results are presented that support the derived methods.
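As an illustration of the signal model described above, the following sketch simulates an observation formed from two scaled, delayed replicas of a known pulse and recovers the parameters by least squares, which coincides with ML estimation under a white-noise simplification (the paper itself treats nonwhite noise). The pulse shape, true parameters, grid resolution, and noise level are all hypothetical choices for the demonstration, not taken from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)

# Known reference signal s(t): a Gaussian pulse on a discrete time grid.
n = 512
t = np.arange(n)


def pulse(delay):
    return np.exp(-0.5 * ((t - delay) / 5.0) ** 2)


# Hypothetical true parameters: two scaled, delayed replicas.
true_amps = np.array([1.0, 0.6])
true_delays = np.array([100.0, 260.0])
clean = sum(a * pulse(d) for a, d in zip(true_amps, true_delays))
y = clean + 0.05 * rng.standard_normal(n)  # white noise, for simplicity

# Under white Gaussian noise, ML estimation reduces to least squares:
# grid-search the delay pair, solving for the amplitudes in closed form.
grid = np.arange(20, n - 20, 4)
best = (np.inf, None, None)
for d1 in grid:
    for d2 in grid:
        if d2 <= d1:
            continue
        S = np.stack([pulse(d1), pulse(d2)], axis=1)  # design matrix
        a, *_ = np.linalg.lstsq(S, y, rcond=None)     # amplitude estimate
        r = y - S @ a
        sse = r @ r                                   # residual energy
        if sse < best[0]:
            best = (sse, np.array([d1, d2], dtype=float), a)

sse, est_delays, est_amps = best
print("estimated delays:", est_delays)
print("estimated amplitudes:", est_amps)
```

For nonwhite noise with known covariance R, the same scheme applies after replacing the ordinary least-squares fit with a whitened (generalized least-squares) fit, i.e. weighting the residual by R⁻¹.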