We investigate the robustness of the linear detector operating in a Gaussian environment in the presence of a mismatch between the design interference covariance matrix and the actual one. The Gaussian environment is assumed to consist of known colored clutter, white noise, and a strong unwanted periodic signal with unknown nonstationary power. We derive the asymptotic inverse covariance matrix of the interference as the unwanted signal power tends to infinity, and use this result to develop an asymptotic likelihood-ratio test (LRT). The performance of the new test statistic is analyzed and compared with that of the well-known optimal detector, and the effect of removing the unwanted signal on detection performance is evaluated for an example scenario.
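The infinite-power limit described above can be illustrated with a minimal numerical sketch. Assuming (this rank-one model is an illustration, not necessarily the paper's exact interference model) that the interference covariance has the form R(p) = R0 + p·ss^H, where R0 collects the known colored clutter plus white noise, s is the unwanted signal's direction, and p is its power, the Sherman-Morrison identity gives a closed-form limit for the inverse as p → ∞:

```python
import numpy as np

# Illustrative sketch: interference covariance R(p) = R0 + p * s s^H,
# with R0 = colored clutter + white noise and p the unwanted signal power.
# By the Sherman-Morrison identity, as p -> infinity,
#   R(p)^{-1} -> R0^{-1} - (R0^{-1} s s^H R0^{-1}) / (s^H R0^{-1} s),
# i.e. the limiting inverse nulls the unwanted signal's direction s.

rng = np.random.default_rng(0)
n = 6

# Hermitian positive-definite clutter-plus-noise covariance (synthetic).
A = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
R0 = A @ A.conj().T + np.eye(n)

# Steering vector of the unwanted periodic signal (assumed known direction).
s = np.exp(1j * 2 * np.pi * 0.2 * np.arange(n))[:, None]

R0_inv = np.linalg.inv(R0)
# Asymptotic inverse as the unwanted signal power tends to infinity.
R_inf = R0_inv - (R0_inv @ s @ s.conj().T @ R0_inv) / (s.conj().T @ R0_inv @ s)

# Compare with the exact inverse at a large but finite power.
p = 1e8
R_exact = np.linalg.inv(R0 + p * (s @ s.conj().T))
print(np.allclose(R_inf, R_exact, atol=1e-6))
```

Note that R_inf · s = 0, so a detector built from the asymptotic inverse effectively projects out the unwanted signal, which is why its performance can be studied without knowing the (nonstationary) power p.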