In this paper we study the spectral properties of an auto-regressive (AR) spectral estimator proposed by Cadzow, which uses a large number of correlations to set up the normal equations. This set of overdetermined equations is then solved in a least squares sense. The main contribution of this paper is the derivation of asymptotic relationships (as the number of samples goes to infinity) between the number of correlations used, the model order, and the signal-to-noise ratio of the signal on the one hand, and the characteristics of the resulting spectral estimate on the other, when the signal under study is composed of sinusoids in noise. The characteristics studied are the height, bandwidth, and area of the peaks in the estimated spectrum. The method is shown to be a spectral density estimator, like the maximum entropy (ME) method, in which spectral areas rather than spectral values should be interpreted as estimates of power. The role of the number of correlations as a signal-to-noise ratio enhancer is discussed. Computer simulations are presented which verify the theoretical results.
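The overdetermined approach described above can be sketched as follows: estimate more autocorrelation lags than the model order requires, stack the extended Yule-Walker equations into a tall system, and solve it in the least squares sense. This is a minimal illustrative sketch, not the paper's exact formulation; the function names, the biased autocorrelation estimator, and the parameter choices are assumptions for the example.

```python
import numpy as np

def overdetermined_ar(x, p, M):
    """Fit an AR(p) model using M > p autocorrelation lags, solved in a
    least squares sense (a sketch of the overdetermined Yule-Walker idea;
    names and details here are illustrative, not from the paper)."""
    N = len(x)
    # Biased autocorrelation estimates for lags 0..M
    r = np.array([np.dot(x[:N - k], x[k:]) / N for k in range(M + 1)])
    # Extended Yule-Walker equations, one row per lag i = 1..M:
    #   sum_{j=1}^{p} a_j r[|i - j|] = -r[i]
    R = np.array([[r[abs(i - j)] for j in range(1, p + 1)]
                  for i in range(1, M + 1)])
    rhs = -r[1:M + 1]
    a, *_ = np.linalg.lstsq(R, rhs, rcond=None)
    # Return the full AR polynomial coefficients [1, a_1, ..., a_p]
    return np.concatenate(([1.0], a))

def ar_spectrum(a, freqs):
    """Evaluate the AR spectral estimate 1/|A(e^{j2*pi*f})|^2 on a grid of
    normalized frequencies (sampling rate = 1)."""
    u = np.exp(-2j * np.pi * np.asarray(freqs))
    # A(e^{-jw}) = sum_k a_k e^{-jwk}; polyval wants highest degree first
    A = np.polyval(a[::-1], u)
    return 1.0 / np.abs(A) ** 2
```

As a usage example, fitting a low-order model with many lags to a sinusoid in noise should place the dominant spectral peak near the sinusoid's frequency; using M well above p is what the abstract refers to as exploiting extra correlations.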