Detection algorithms in a nonhomogeneous clutter environment must contend with excessive false alarm rates when the clutter is not sufficiently suppressed. Prior to signal detection, adaptive receivers must either "null" or "whiten" the clutter. The former approach requires estimates of the basis vectors of the clutter subspace, while the latter requires estimates of the covariance matrix of the clutter. It may be preferable to null the clutter when the dimensionality of the clutter subspace (M) is small in comparison to the length of the coherently processed data vector (N), and also when the signal to be detected does not lie substantially within the clutter-plus-noise subspace. Depending on the scenario, radar clutter data may exhibit a variety of effects, such as clutter discretes, data dropouts caused by terrain masking (shadowing), multipath effects, and non-Gaussian amplitude statistics. We consider a statistical model that can account for some of these effects and apply the model to design an algorithm that obtains a maximum likelihood estimate (MLE) of the orthonormal basis vectors of the clutter-plus-noise subspace. We show that the problem can be reformulated to develop a recursive approach to the MLE based on the expectation-maximization (EM) algorithm. The analysis shows that each training vector must be scaled by a factor in the interval [0, 1] prior to subspace estimation, and that the scale factor applied to each training vector increases monotonically with the vector's clutter-plus-noise power within the subspace. Conditions for the existence and uniqueness of the solution are considered. A statistical interpretation of the solution is provided, along with sample simulation results.
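The key structural result of the abstract (an EM-style recursion that scales each training vector by a factor in [0, 1], growing monotonically with the vector's in-subspace power, before re-estimating the orthonormal basis) can be illustrated with a schematic sketch. This is not the paper's algorithm: the monotone weighting function, the assumed known noise floor `noise_var`, and the SVD-based basis update are all illustrative assumptions chosen to mimic the described structure.

```python
import numpy as np

def scaled_subspace_em(X, M, noise_var=1.0, iters=50):
    """Schematic EM-like recursion (illustrative, not the paper's exact MLE).

    Alternates two steps:
      E-like step: compute each training vector's power inside the current
        subspace estimate and map it to a weight in [0, 1] that increases
        monotonically with that power (weighting function is an assumption).
      M-like step: re-estimate an M-dimensional orthonormal basis from the
        scaled training vectors via a truncated SVD.

    X : (N, K) array whose columns are K length-N training vectors.
    Returns (Q, w): orthonormal basis (N, M) and per-vector weights (K,).
    """
    # Initialize the basis from a plain SVD of the unscaled data.
    Q = np.linalg.svd(X, full_matrices=False)[0][:, :M]
    for _ in range(iters):
        # Power of each training vector within the current subspace estimate.
        p = np.sum((Q.T @ X) ** 2, axis=0)
        # Monotone map of subspace power into [0, 1] (illustrative choice).
        w = p / (p + noise_var * M)
        # Re-estimate the basis from the scaled training vectors.
        Q = np.linalg.svd(X * w, full_matrices=False)[0][:, :M]
    return Q, w
```

Scaling each column by `w` before the SVD down-weights training vectors with little energy in the current subspace, which is the qualitative behavior the abstract attributes to the MLE solution.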