We show that the minimum classification error (MCE) criterion gives an upper bound on the true Bayes error rate, independent of the corresponding model distribution. In addition, we show that model-free optimization of the MCE criterion admits a closed-form solution in the asymptotic case of infinite training data. Although it attains the Bayes error rate, the resulting model distribution differs from the true distribution. This suggests that the structure of model distributions trained with the MCE criterion should differ from the structure of the true distributions as they are conventionally used in statistical pattern recognition.
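For reference, a common smoothed formulation of the MCE criterion (in the style of Juang and Katagiri) may be sketched as follows; the discriminant functions $g_k$, smoothing parameters $\eta, \gamma$, and class count $K$ are notational assumptions not taken from the abstract itself:

```latex
% Misclassification measure for a sample x of true class k,
% with discriminant functions g_j(x; \Lambda):
d_k(x) = -g_k(x;\Lambda)
  + \log\!\Big[\frac{1}{K-1}\sum_{j \neq k} e^{\eta\, g_j(x;\Lambda)}\Big]^{1/\eta}

% Smoothed 0-1 loss via a sigmoid with slope \gamma > 0:
\ell\big(d_k(x)\big) = \frac{1}{1 + e^{-\gamma\, d_k(x)}}

% The empirical MCE criterion averages this loss over the training set:
E(\Lambda) = \frac{1}{N}\sum_{n=1}^{N} \ell\big(d_{k_n}(x_n)\big)
```

As $\gamma \to \infty$ the sigmoid approaches the 0-1 loss, which is the sense in which the criterion relates to the classification error rate discussed above.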