In the framework of multiple classifier systems, we propose reformulating the classifier combination problem as a pattern recognition problem in its own right. In this approach, each input pattern is associated with a feature vector composed of the outputs of the classifiers to be combined. A Bayesian network is used to automatically infer the probability distribution for each class and, ultimately, to perform the final classification. We adopt Bayesian networks because they not only provide a basis for efficient probabilistic inference, but also offer a natural and compact way to encode exponentially sized joint probability distributions. Two systems, one adopting an ensemble of Back-Propagation neural networks and the other an ensemble of Learning Vector Quantization neural networks, have been tested on the Image database from the UCI repository. The performance of the proposed systems has been compared with that of multi-expert systems adopting the same ensembles but combining them by Majority Vote, Weighted Majority Vote, and Borda Count.
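The three baseline combination rules mentioned above can be sketched as follows. This is a minimal illustration, not the paper's implementation: the function names, the list-based interface, and the per-classifier weights are our assumptions.

```python
from collections import Counter

def majority_vote(predictions):
    """Return the class label chosen by the most base classifiers.

    `predictions` holds one crisp label per classifier."""
    return Counter(predictions).most_common(1)[0][0]

def weighted_majority_vote(predictions, weights):
    """Like majority vote, but each classifier's label counts with a
    weight (e.g. its estimated reliability -- an assumption here)."""
    scores = Counter()
    for label, w in zip(predictions, weights):
        scores[label] += w
    return scores.most_common(1)[0][0]

def borda_count(rankings, n_classes):
    """Each classifier ranks all classes, best first; a class at rank
    position r earns (n_classes - 1 - r) points, highest total wins."""
    scores = [0] * n_classes
    for ranking in rankings:
        for pos, cls in enumerate(ranking):
            scores[cls] += n_classes - 1 - pos
    return max(range(n_classes), key=lambda c: scores[c])

# Three hypothetical base classifiers over 3 classes:
print(majority_vote([0, 1, 1]))                        # -> 1
print(weighted_majority_vote([0, 1, 1], [0.9, 0.3, 0.3]))  # -> 0
print(borda_count([[0, 1, 2], [1, 0, 2], [1, 2, 0]], 3))   # -> 1
```

Note that Borda Count exploits each classifier's full ranking of the classes, whereas the two voting rules use only the top choice; the proposed Bayesian-network combiner instead treats the whole output vector as a feature vector and learns the class-conditional distribution from data.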