Orthogonal least squares algorithm for training multi-output radial basis function networks

3 Author(s)
S. Chen, P.M. Grant, C.F.N. Cowan (Edinburgh Univ., UK)

The radial basis function (RBF) network offers a viable alternative to the two-layer neural network in many signal processing applications. A novel learning algorithm for RBF networks (S. Chen et al., 1990, 1991) has been derived based on the orthogonal least squares (OLS) method operating in a forward regression manner (Chen et al., 1989). This is a rational way to choose RBF centres from data points, because each selected centre maximizes the increment to the explained variance of the desired output, and the algorithm does not suffer from numerical ill-conditioning problems. This learning algorithm was originally derived for RBF networks with a scalar output. The present study extends this previous result to multi-output RBF networks. The basic idea is to use the trace of the desired output covariance as the selection criterion in place of the variance used in the single-output case. Reconstruction of PAM signals and nonlinear system modelling are used as two examples to demonstrate the effectiveness of this learning algorithm.
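The forward OLS selection described in the abstract can be sketched as follows. This is a minimal illustration, not the authors' implementation: it assumes Gaussian basis functions with a fixed width, takes every data point as a candidate centre, and at each step greedily picks the candidate whose orthogonalized regressor maximizes the error-reduction ratio summed over all outputs (the trace criterion for the multi-output case). The function and parameter names are hypothetical.

```python
import numpy as np

def rbf_design_matrix(X, centres, width):
    # Gaussian basis responses: phi[i, j] = exp(-||x_i - c_j||^2 / width^2)
    d2 = ((X[:, None, :] - centres[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / width**2)

def ols_select_centres(X, D, n_centres, width=1.0):
    """Forward OLS selection of RBF centres from the data points.

    X : (N, d) inputs; D : (N, m) multi-output desired responses.
    At each step, orthogonalize every remaining candidate regressor
    against the already-chosen ones (Gram-Schmidt) and select the one
    with the largest error-reduction ratio summed over the m outputs,
    i.e. normalized by trace(D^T D) rather than a single variance.
    """
    P = rbf_design_matrix(X, X, width)  # candidate regressors, one per data point
    N = X.shape[0]
    selected, W = [], []                # chosen indices, orthogonalized regressors
    trace_dd = np.trace(D.T @ D)
    for _ in range(n_centres):
        best_err, best_k, best_w = -1.0, None, None
        for k in range(N):
            if k in selected:
                continue
            w = P[:, k].astype(float).copy()
            for wj in W:                # orthogonalize against chosen regressors
                w -= (wj @ P[:, k]) / (wj @ wj) * wj
            denom = w @ w
            if denom < 1e-12:           # numerically dependent candidate: skip
                continue
            g = (w @ D) / denom         # (m,) LS coefficients, one per output
            err = denom * (g @ g) / trace_dd  # summed error-reduction ratio
            if err > best_err:
                best_err, best_k, best_w = err, k, w
        selected.append(best_k)
        W.append(best_w)
    return selected
```

Because candidates are orthogonalized before scoring, a centre nearly collinear with those already chosen contributes almost nothing and is skipped, which is why the method avoids the ill-conditioning that plagues naive least-squares fitting of redundant centres.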

Published in:

Second International Conference on Artificial Neural Networks, 1991

Date of Conference:

18-20 Nov 1991