
On-line learning using hierarchical mixtures of experts

Author:

Tham, C.K. ; National University of Singapore, Singapore

In the hierarchical mixtures of experts (HME) framework, outputs from several function approximators specializing in different parts of the input space are combined. Fast learning algorithms derived from the expectation-maximization algorithm have previously been proposed, but they are predominantly for batch learning. In this paper, several on-line learning algorithms are developed for the HME. Their performance in a piecewise linear regression task is compared according to criteria such as speed of convergence, quality of solutions, and storage and computational costs.
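To illustrate the combine-by-gating structure the abstract describes, here is a minimal sketch of a one-level mixture of two linear experts with a softmax gating network, trained one pattern at a time by stochastic gradient descent on squared error. This is an illustrative assumption, not the paper's EM-derived on-line algorithms, and the piecewise-linear target function is hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

n_experts, dim, lr = 2, 2, 0.05                    # input is [x, 1] (bias folded in)
W = rng.normal(scale=0.1, size=(n_experts, dim))   # expert (linear) weights
V = rng.normal(scale=0.1, size=(n_experts, dim))   # gating-network weights

def predict(phi):
    g = np.exp(V @ phi)
    g /= g.sum()                                   # softmax gate activations
    mu = W @ phi                                   # each expert's output
    return g, mu, g @ mu                           # gates, expert outputs, blend

def target(x):
    # hypothetical piecewise-linear target: slope depends on sign of x
    return 2.0 * x if x < 0 else -1.0 * x

for t in range(20000):                             # one pattern per update (on-line)
    x = rng.uniform(-1, 1)
    phi = np.array([x, 1.0])
    g, mu, y_hat = predict(phi)
    err = target(x) - y_hat
    # gradients of 0.5*err^2: d y_hat/d w_i = g_i*phi,
    # d y_hat/d v_i = g_i*(mu_i - y_hat)*phi (softmax derivative)
    W += lr * err * g[:, None] * phi[None, :]
    V += lr * err * ((mu - y_hat) * g)[:, None] * phi[None, :]
```

Each expert ends up specializing in one linear piece of the input space, while the gating network learns a soft partition; the hierarchical version in the paper nests such gated mixtures.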

Published in:

Fourth International Conference on Artificial Neural Networks, 1995

Date of Conference:

26-28 Jun 1995