Using structure of data to improve classification

2 Author(s): C. M. O'Keefe (Math. & Inf. Sci., CSIRO, Glen Osmond, SA, Australia); G. A. Jarrad

Statistical mixture-of-experts models are often used for data analysis tasks such as clustering, regression and classification. We consider two mixture-of-experts models: the shared mixture classifier and the hierarchical mixture-of-experts classifier. We discuss the initialisation and optimisation of the structure and parameters of each classifier. In particular, we initialise the hierarchical mixture-of-experts classifier with the public-domain OC1 decision tree software. We compare the performance of the two classifiers on four datasets, two artificial and two real, finding that the hierarchical mixture-of-experts classifier achieves superior classification performance on the testing data.
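To illustrate the kind of model the abstract refers to, the sketch below shows a minimal flat mixture-of-experts classifier: a softmax gating network weights the predictions of several logistic experts. This is a generic illustration, not the paper's method; the parameter names and the hand-picked weights are hypothetical, and the hierarchical variant and OC1-based initialisation described in the abstract are not reproduced here.

```python
import numpy as np

def softmax(z):
    """Numerically stable row-wise softmax."""
    z = z - z.max(axis=1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def moe_predict(X, gate_W, expert_Ws):
    """Mixture-of-experts prediction: P(y=1|x) = sum_j g_j(x) * sigmoid(x . w_j).

    gate_W:    (d, n_experts) gating-network weights
    expert_Ws: (d, n_experts) one logistic expert per column
    """
    g = softmax(X @ gate_W)     # (n, n_experts) gate responsibilities
    p = sigmoid(X @ expert_Ws)  # (n, n_experts) per-expert predictions
    return (g * p).sum(axis=1)  # gated mixture of expert outputs

# Tiny usage example with assumed, hand-picked weights: the gate routes
# each input to the expert specialised for its dominant feature.
X = np.array([[1.0, 0.0],
              [0.0, 1.0]])
gate_W = np.array([[2.0, -2.0],
                   [-2.0, 2.0]])
expert_Ws = np.array([[4.0, 0.0],
                      [0.0, -4.0]])
probs = moe_predict(X, gate_W, expert_Ws)
print(probs)  # first input near 1, second near 0
```

In practice the gate and expert parameters would be fitted jointly, typically by expectation-maximisation or gradient ascent on the mixture likelihood, rather than set by hand as above.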

Published in:

Information, Decision and Control, 2002. Final Program and Abstracts

Date of Conference:

11-13 Feb. 2002