The use of adaptive algorithms to mitigate the detrimental effects of noise on receivers employing antenna arrays is central to modern-day radar systems. Most of these algorithms assume that the target is confined to a single range cell. Under practical operating conditions, however, the target may be distributed across several range cells, and the resulting signal contamination degrades the performance of the adaptive algorithm. In addition, the design of the adaptive algorithm relies on a clutter-plus-noise covariance matrix, which is usually estimated from samples taken from range cells surrounding the test cell. Performance suffers if the underlying test-cell covariance matrix differs from the average covariance matrix of the surrounding range cells. We analyze a space-time adaptive processing (STAP) algorithm designed to turn signal contamination to the advantage of the receiver. Performance expressions that incorporate the possibility of covariance matrix mismatch are developed for such distributed-target scenarios. Numerical analysis illustrates that the presented algorithm performs significantly better than traditional STAP algorithms in signal-contaminated environments. The investigation also shows how variations in the parameters that describe covariance matrix mismatch affect performance.
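To make the training-data issue described above concrete, the sketch below illustrates the standard sample-matrix-inversion (SMI) baseline that the paper's algorithm is compared against: the clutter-plus-noise covariance is estimated from secondary range cells surrounding the test cell, and an adaptive space-time weight vector is formed from that estimate. This is a minimal illustrative simulation, not the paper's proposed algorithm; the array dimensions, steering frequencies, and clutter model are all assumptions chosen for demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Assumed dimensions: N antenna elements x M pulses -> NM-length snapshot.
N, M = 4, 8
NM = N * M

# Hypothetical target steering vector at normalized spatial/Doppler
# frequencies (fs, fd); the Kronecker structure is standard in STAP.
fs, fd = 0.10, 0.25
a = np.exp(2j * np.pi * fs * np.arange(N))   # spatial steering
b = np.exp(2j * np.pi * fd * np.arange(M))   # temporal (Doppler) steering
s = np.kron(b, a)                            # space-time steering vector

# Illustrative "true" clutter-plus-noise covariance: unit noise floor plus
# a clutter ridge where spatial and Doppler frequencies coincide.
Rc = np.eye(NM, dtype=complex)
ridge = np.linspace(-0.4, 0.4, 20)
for fc in ridge:
    v = np.kron(np.exp(2j * np.pi * fc * np.arange(M)),
                np.exp(2j * np.pi * fc * np.arange(N)))
    Rc += 10.0 * np.outer(v, v.conj()) / ridge.size

# Estimate the covariance from K secondary (training) range cells, i.e.
# target-free samples drawn from cells surrounding the test cell.
K = 3 * NM
L = np.linalg.cholesky(Rc)
X = L @ (rng.standard_normal((NM, K)) +
         1j * rng.standard_normal((NM, K))) / np.sqrt(2)
R_hat = (X @ X.conj().T) / K

# SMI adaptive weight, w = R_hat^{-1} s (up to an irrelevant scalar).
w = np.linalg.solve(R_hat, s)

# Output SINR relative to the known-covariance optimum (loss is <= 1).
sinr = np.abs(w.conj() @ s) ** 2 / np.real(w.conj() @ Rc @ w)
sinr_opt = np.real(s.conj() @ np.linalg.solve(Rc, s))
loss = sinr / sinr_opt
print(f"SINR loss relative to optimum: {loss:.3f}")
```

With K = 3·NM training snapshots and a matched training covariance, the classical Reed–Mallett–Brennan result predicts an average SINR loss of roughly (K + 2 − NM)/(K + 1); the losses the paper addresses are the additional ones that arise when the training cells are contaminated by target energy or have a covariance that differs from the test cell's.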