One of the most widely used intelligent techniques for classification is the neural network. In real classification applications the patterns of different classes often overlap. In this situation the most appropriate classifier is one whose outputs represent the class conditional probabilities. In traditional statistics these probabilities are calculated in two steps: first the underlying probability densities are estimated, and then Bayes' rule is applied. One of the most popular methods for density estimation is the Gaussian Mixture Model. Alternatively, the class conditional probabilities can be computed directly with a Multilayer Perceptron artificial neural network. Although it is not yet known which method is better in the general case, we demonstrate in this paper that the Multilayer Perceptron is superior to the Gaussian Mixture Model when the underlying probability densities are discontinuous along the border of their support.
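The two routes the abstract contrasts can be sketched as follows. This is a minimal illustration, not the paper's experimental setup: it assumes scikit-learn and uses synthetic two-class data chosen only for demonstration. The generative route fits one Gaussian Mixture per class and applies Bayes' rule to obtain posteriors; the discriminative route trains a Multilayer Perceptron to estimate the posteriors directly.

```python
import numpy as np
from sklearn.mixture import GaussianMixture
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(0)
# Synthetic, overlapping two-class data (illustrative assumption only)
X0 = rng.normal(loc=0.0, scale=1.0, size=(200, 2))
X1 = rng.normal(loc=1.5, scale=1.0, size=(200, 2))
X = np.vstack([X0, X1])
y = np.array([0] * 200 + [1] * 200)

# --- Generative route: class-conditional GMMs + Bayes' rule ---
priors = np.array([0.5, 0.5])  # equal class priors in this toy example
gmms = [GaussianMixture(n_components=2, random_state=0).fit(X[y == c])
        for c in (0, 1)]

def gmm_posteriors(x):
    # score_samples returns log p(x|c); Bayes' rule normalizes p(x|c) * P(c)
    log_lik = np.column_stack([g.score_samples(x) for g in gmms])
    joint = np.exp(log_lik) * priors
    return joint / joint.sum(axis=1, keepdims=True)

# --- Discriminative route: MLP estimates P(c|x) directly ---
mlp = MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000,
                    random_state=0).fit(X, y)

p_gmm = gmm_posteriors(X[:5])     # posteriors via density estimation
p_mlp = mlp.predict_proba(X[:5])  # posteriors estimated directly
```

Both arrays hold rows that sum to one, so the two classifiers can be compared posterior-by-posterior; the paper's claim concerns which estimate degrades less when the true densities have a discontinuity at the edge of their support.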
Date of Conference: 23-29 Aug. 2009