The correct formulation of likelihood ratio test (LRT)-based detection schemes for a target in the presence of clutter requires a faithful and mathematically tractable clutter model. State-of-the-art detection methods usually assume that the land and sea clutter in whose presence a target signal is to be detected follows a K-distributed form of non-Gaussian clutter, and use an LRT detector derived accordingly. However, the macroscopic phenomena of clutter generation in a search radar, as well as detailed studies of measured clutter data, show a significant texture fluctuation caused by the continuous antenna scanning motion. Incorporating the scanning effect into the clutter model introduces challenges in modelling the varying texture and in formulating a suitable detection scheme. Here the authors propose a clutter model for such scanning radar applications that takes the effect of scanning on the clutter correlation into consideration. A method of fitting the proposed model to measured data and a method of simulating clutter according to the proposed model are also presented. The proposed model is found to be attractive for LRT detector design. Furthermore, the proposed clutter model is validated against real experimental data, which shows an excellent match between the simulated and measured data for both sea and land clutter. The close agreement between the model and the measurements clearly illustrates the validity and applicability of the model.
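For context, the K-distributed clutter assumption mentioned above is commonly realised as a compound-Gaussian model: a gamma-distributed texture (slowly varying local power) modulating a complex Gaussian speckle component. The sketch below illustrates this standard construction only; it is not the authors' proposed scanning-dependent model, and the function name and parameters are illustrative assumptions.

```python
import numpy as np

def simulate_k_clutter(n, shape=0.5, scale=1.0, seed=None):
    """Draw n complex samples whose amplitude is K-distributed.

    Standard compound-Gaussian construction (illustrative, not the
    paper's scanning model): z = sqrt(texture) * speckle, where the
    texture is gamma(shape, scale) and the speckle is unit-power
    circular complex Gaussian.
    """
    rng = np.random.default_rng(seed)
    # Texture: gamma-distributed local clutter power
    texture = rng.gamma(shape, scale, size=n)
    # Speckle: zero-mean circular complex Gaussian, E[|s|^2] = 1
    speckle = (rng.standard_normal(n) + 1j * rng.standard_normal(n)) / np.sqrt(2)
    return np.sqrt(texture) * speckle

# Example: 100k samples; mean power E[|z|^2] equals shape * scale
samples = simulate_k_clutter(100_000, shape=0.5, scale=1.0, seed=42)
```

A smaller gamma shape parameter gives spikier (heavier-tailed) clutter, which is the regime where K-distributed models depart most from the Gaussian assumption.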