
Boosted Mixture of Experts: An Ensemble Learning Scheme


Abstract:

We present a new supervised learning procedure for ensemble machines, in which outputs of predictors, trained on different distributions, are combined by a dynamic classifier combination model. This procedure may be viewed as either a version of mixture of experts (Jacobs, Jordan, Nowlan, & Hinton, 1991), applied to classification, or a variant of the boosting algorithm (Schapire, 1990). As a variant of the mixture of experts, it can be made appropriate for general classification and regression problems by initializing the partition of the data set to different experts in a boostlike manner. If viewed as a variant of the boosting algorithm, its main gain is the use of a dynamic combination model for the outputs of the networks. Results are demonstrated on a synthetic example and a digit recognition task from the NIST database and compared with classical ensemble approaches.
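The dynamic classifier combination described above can be illustrated with a small gating-network sketch: each expert emits class probabilities, and a gate produces input-dependent mixing weights over the experts. This is a minimal illustration of the mixture-of-experts combination idea only; the random linear experts, the linear-softmax gate, and all variable names are assumptions for demonstration, not the paper's exact formulation or its boost-like training procedure.

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(z, axis=-1):
    # Numerically stable softmax along the given axis.
    z = z - z.max(axis=axis, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=axis, keepdims=True)

n_samples, n_features, n_experts, n_classes = 5, 4, 3, 2
x = rng.normal(size=(n_samples, n_features))

# Toy "experts": fixed random linear classifiers, each mapping an input
# to a class-probability vector (shape: samples x experts x classes).
expert_W = rng.normal(size=(n_experts, n_features, n_classes))
expert_probs = softmax(np.einsum("nd,edc->nec", x, expert_W))

# Gating model: input-dependent weights over experts (each row sums to 1).
gate_V = rng.normal(size=(n_features, n_experts))
gate = softmax(x @ gate_V)  # shape: samples x experts

# Dynamic combination: per-input weighted mixture of the experts' outputs.
combined = np.einsum("ne,nec->nc", gate, expert_probs)  # samples x classes
```

Because the gate weights depend on the input `x`, different experts dominate in different regions of the input space, which is the distinguishing feature relative to the fixed (input-independent) voting or averaging used in classical ensemble approaches.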
Published in: Neural Computation ( Volume: 11, Issue: 2, 15 February 1999)
Page(s): 483 - 497
Date of Publication: 15 February 1999
Print ISSN: 0899-7667
