Using Bayesian Network for combining classifiers

4 Author(s)

In the framework of multiple classifier systems, we propose to reformulate the classifier combination problem as a pattern recognition one. Following this approach, each input pattern is associated with a feature vector composed of the outputs of the classifiers to be combined. A Bayesian Network is used to automatically infer the probability distribution for each class and, eventually, to perform the final classification. We propose to use Bayesian Networks because they not only provide a basis for efficient probabilistic inference, but also offer a natural and compact way to encode exponentially sized joint probability distributions. Two systems, adopting an ensemble of Back-Propagation neural networks and an ensemble of Learning Vector Quantization neural networks, respectively, have been tested on the Image database from the UCI repository. The performance of the proposed systems has been compared with that of multi-expert systems adopting the same ensembles but combining them with Majority Vote, Weighted Majority Vote, and Borda Count.
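The core idea above, treating the base classifiers' outputs as a feature vector and learning P(class | outputs), can be sketched with the simplest possible Bayesian-network structure: a naive Bayes model over the classifiers' discrete predictions, estimated by Laplace-smoothed counting. This is an illustrative simplification, not the paper's exact network; the class `BayesCombiner` and its methods are hypothetical names introduced here.

```python
from collections import Counter, defaultdict


class BayesCombiner:
    """Combine K classifiers' discrete outputs via a naive Bayes model.

    Each pattern is represented by the K-tuple of labels predicted by the
    base classifiers; we estimate P(true class) and, per classifier,
    P(predicted label | true class) from counts (Laplace-smoothed).
    NOTE: a sketch of the stacking idea, not the paper's full network.
    """

    def __init__(self, labels):
        self.labels = list(labels)  # the set of possible class labels

    def fit(self, outputs, truths):
        # outputs: list of K-tuples of base-classifier predictions
        # truths:  list of true class labels, aligned with outputs
        self.n = len(truths)
        self.prior = Counter(truths)
        k = len(outputs[0])
        # cond[k][y][o] = how often classifier k predicted o when truth was y
        self.cond = [defaultdict(Counter) for _ in range(k)]
        for vec, y in zip(outputs, truths):
            for i, o in enumerate(vec):
                self.cond[i][y][o] += 1
        return self

    def predict(self, vec):
        # Pick the class maximizing P(y) * prod_k P(o_k | y)
        best, best_p = None, -1.0
        m = len(self.labels)
        for y in self.labels:
            p = (self.prior[y] + 1) / (self.n + m)
            for i, o in enumerate(vec):
                total = sum(self.cond[i][y].values())
                p *= (self.cond[i][y][o] + 1) / (total + m)
            if p > best_p:
                best, best_p = y, p
        return best


# Toy usage: two base classifiers, two classes
train_outputs = [("a", "a"), ("a", "b"), ("b", "b"), ("b", "b")]
train_truths = ["a", "a", "b", "b"]
combiner = BayesCombiner(["a", "b"]).fit(train_outputs, train_truths)
print(combiner.predict(("b", "b")))  # → b
```

In contrast to Majority Vote, which weighs all classifiers equally, this combiner learns from the training data which classifiers tend to be right for which classes; that is the advantage the abstract attributes to the Bayesian-network approach.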

Published in:

14th International Conference on Image Analysis and Processing (ICIAP 2007)

Date of Conference:

10-14 Sept. 2007