Most of the recent compressive sensing (CS) literature has focused on sparse signal recovery from compressive measurements. However, exact signal recovery may not be required in certain signal processing applications, such as inference problems. In this paper, we provide performance limits for the classification of both sparse and not necessarily sparse signals based on compressive measurements. When signals are not necessarily sparse, we show that the Kullback-Leibler and Chernoff distances between the probability density functions (pdfs) under any two hypotheses are preserved up to a factor of M/N with M-length (M < N) compressive measurements, compared to the corresponding distances with N-length original measurements, provided that the pdfs of the original N-length observation vectors exhibit certain properties. These results are used to quantify performance limits in terms of upper and lower bounds on the probability of error in signal classification with M-length compressive measurements. When the signals of interest are sparse in the standard canonical basis, performance limits are derived in terms of lower bounds on the probability of error in classifying sparse signals with any classification rule.
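The M/N scaling of the distance measures can be illustrated with a small numerical sketch. The example below is not the paper's derivation; it assumes a simple mean-shift Gaussian hypothesis pair N(mu0, sigma^2 I) vs. N(mu1, sigma^2 I), for which the Kullback-Leibler distance is ||mu1 - mu0||^2 / (2 sigma^2), and a random compression matrix Phi with orthonormal rows, so that the compressed KL distance is ||Phi (mu1 - mu0)||^2 / (2 sigma^2). All dimensions and parameters (N, M, sigma^2, the mean difference d) are hypothetical choices for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

N, M = 200, 50        # original and compressed lengths (hypothetical values)
sigma2 = 1.0          # shared noise variance under both hypotheses
d = rng.standard_normal(N)  # mean difference mu1 - mu0 (arbitrary example)

# KL distance between N(mu0, sigma2*I) and N(mu1, sigma2*I) in R^N.
kl_full = d @ d / (2 * sigma2)

# Average the compressed KL distance over random orthoprojectors Phi
# (M x N with orthonormal rows, obtained via QR of a Gaussian matrix).
trials = 2000
kl_comp = np.empty(trials)
for t in range(trials):
    Phi = np.linalg.qr(rng.standard_normal((N, M)))[0].T  # orthonormal rows
    c = Phi @ d
    kl_comp[t] = c @ c / (2 * sigma2)

ratio = kl_comp.mean() / kl_full
print(ratio)  # close to M/N = 0.25 on average
```

On average over the random choice of Phi, the compressed KL distance is a fraction M/N of the original, matching the scaling stated above for this Gaussian special case.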