A situation of great practical importance in pattern recognition is the case where the designer has only a finite number of sample patterns from each class and the class-conditional density functions are not completely known. Recent results indicate that in this case the dimensionality of the pattern vector, i.e., the number of measurements, should not be increased arbitrarily, since beyond a certain value (the optimal measurement complexity) performance begins to deteriorate rather than improve steadily. Whether this phenomenon also occurs in the case of independent measurements has, until now, been an open question. In this paper the following result of practical importance is derived. When the measurements are independent and a Bayesian approach is taken, one can add extra measurements without fear of this peaking of performance; i.e., the optimal measurement complexity is infinite. In fact, under certain conditions, having just one sample from class 1, and none at all from class 2, can yield a recognition accuracy arbitrarily close to unity for a large enough number of measurements. The implications of these results for practice are discussed, along with the general question of dimensionality and sample size.