The problem of resolving closely spaced signal sources with an antenna array remains difficult, although several estimation methods are available in the literature. When the array correlation matrix is known exactly, the resolution capability of subspace algorithms is in principle unlimited. However, in the presence of modeling errors the resolution deteriorates, even for a known correlation matrix. In this paper, we analyze the MUSIC method using three different definitions of resolution. Assuming Gaussian circular random modeling errors, we derive the corresponding expressions for the probability of source resolution as a function of the model mismatch. A first series of simulations validates the mathematical expressions of the three resolution probabilities. A second series of simulations is used to select, among the three, the one that best matches the empirical probability. The results are useful, e.g., for determining the antenna calibration accuracy needed to achieve a target performance.
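The setting the abstract describes can be illustrated with a minimal numerical sketch: MUSIC applied to a known correlation matrix whose steering vectors carry Gaussian circular perturbations, together with one common resolution criterion (the pseudospectrum at the midpoint between the two true directions dipping below its value at both directions). The array size, source angles, noise level, and error level `sigma` below are illustrative assumptions, not values from the paper, and the resolution test shown is only one of the several definitions the paper compares.

```python
import numpy as np

def music_spectrum(R, n_sources, angles_deg, n_ant, d=0.5):
    """MUSIC pseudospectrum for a uniform linear array (half-wavelength spacing)."""
    # Eigendecompose the (Hermitian) array correlation matrix; eigh returns
    # eigenvalues in ascending order, so the first columns span the noise subspace.
    _, eigvecs = np.linalg.eigh(R)
    En = eigvecs[:, : n_ant - n_sources]
    spectrum = np.empty(len(angles_deg))
    for i, theta in enumerate(np.deg2rad(angles_deg)):
        a = np.exp(-2j * np.pi * d * np.arange(n_ant) * np.sin(theta))
        # Pseudospectrum: inverse projection of a(theta) onto the noise subspace.
        spectrum[i] = 1.0 / np.real(a.conj() @ En @ En.conj().T @ a)
    return spectrum

# Two closely spaced sources at -1 and +1 degrees, known correlation matrix,
# with Gaussian circular perturbations on the steering vectors.
# sigma is an assumed modeling-error level chosen for illustration.
rng = np.random.default_rng(0)
n_ant, doas, sigma = 10, np.array([-1.0, 1.0]), 0.05
A = np.exp(-2j * np.pi * 0.5
           * np.outer(np.arange(n_ant), np.sin(np.deg2rad(doas))))
A_err = A + sigma * (rng.standard_normal(A.shape)
                     + 1j * rng.standard_normal(A.shape)) / np.sqrt(2)
R = A_err @ A_err.conj().T + 0.01 * np.eye(n_ant)  # unit-power sources, -20 dB noise

grid = np.linspace(-5, 5, 1001)
P = music_spectrum(R, n_sources=2, angles_deg=grid, n_ant=n_ant)

# One possible resolution criterion: the spectrum at the midpoint between
# the true directions lies below its value at both true directions.
mid = P[np.argmin(np.abs(grid - 0.0))]
at_doas = [P[np.argmin(np.abs(grid - t))] for t in doas]
resolved = mid < min(at_doas)
print("resolved:", bool(resolved))
```

Repeating this over many independent draws of the perturbation yields an empirical resolution probability as a function of `sigma`, which is the quantity the paper characterizes analytically.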