We derive the Neyman-Pearson error exponent for the detection of Gauss-Markov signals using randomly spaced sensors. We assume that the sensor spacings d_1, d_2, ... are drawn independently from a common density f_d(·), and we treat both stationary and nonstationary Markov models. Error exponents are evaluated using specialized forms of the strong law of large numbers and are seen to take algebraically simple forms involving the parameters of the Markov processes and expectations over f_d(·) of certain functions of d_1. These expressions are evaluated explicitly when f_d(·) corresponds to i) exponentially distributed sensors with placement density lambda; ii) equally spaced sensors; and iii) the preceding cases when sensors fail (or, equivalently, are asleep) with probability q. Many insights follow. For example, we determine the optimal lambda as a function of q in the nonstationary case. Numerical simulations show that the error exponent predicts trends in the simulated error rate accurately, even for small data sizes.
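As a small illustration of case iii), the following sketch (not from the paper; parameter values are arbitrary) simulates sensors placed with i.i.d. exponential spacings of rate lambda, where each sensor independently fails with probability q. A standard thinning argument says the spacings between surviving sensors are again exponential, with effective rate lambda*(1-q), which the empirical mean gap reflects:

```python
import random

random.seed(0)
lam, q, n = 2.0, 0.3, 200_000  # placement density, failure prob, sensor count (arbitrary)

# Place sensors with i.i.d. Exponential(lam) spacings; each sensor
# independently fails (is asleep) with probability q.
pos, x = [], 0.0
for _ in range(n):
    x += random.expovariate(lam)
    if random.random() > q:   # sensor survives
        pos.append(x)

# Spacings between surviving sensors.
gaps = [b - a for a, b in zip(pos, pos[1:])]
mean_gap = sum(gaps) / len(gaps)

# Thinning an exponential renewal process yields exponential spacings
# with rate lam*(1-q), so the mean gap should be near 1/(lam*(1-q)).
print(mean_gap, 1 / (lam * (1 - q)))
```

This is why the failure model folds neatly into the same framework: the surviving-sensor process has the same form with a rescaled density, so the error-exponent expressions over f_d(·) apply directly.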