
Combination of radial basis function neural networks with optimized learning vector quantization


Author:
Vogt, M. (Institute for Parallel and Distributed High Performance Systems, University of Stuttgart, Germany)

Randomly initialized radial basis function (RBF) neural networks are compared to networks whose centers are obtained by vector quantization. It is shown that the error rate of small networks can be decreased by about 28%. To achieve the same performance as a randomly initialized network, a trained network needs only half the number of hidden neurons, which may be important for time-critical applications. The time required to train and initialize a smaller network is comparable to the time required to initialize a larger one.
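The comparison above can be illustrated with a minimal sketch (not the paper's code): output weights of an RBF network are fit by linear least squares, once with randomly chosen centers and once with centers obtained by a basic k-means loop, used here as a stand-in for the paper's optimized learning vector quantization. All function names and parameter choices below are illustrative assumptions.

```python
# Hypothetical sketch: random vs quantization-based RBF center
# initialization. A plain k-means loop stands in for LVQ; the paper's
# actual training procedure is not reproduced here.
import numpy as np

def kmeans_centers(X, k, iters=20, seed=0):
    """Pick k centers with a basic k-means loop (LVQ stand-in)."""
    rng = np.random.default_rng(seed)
    centers = X[rng.choice(len(X), k, replace=False)]
    for _ in range(iters):
        # assign each sample to its nearest center
        d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(-1)
        labels = d2.argmin(1)
        for j in range(k):
            pts = X[labels == j]
            if len(pts):
                centers[j] = pts.mean(0)
    return centers

def rbf_design(X, centers, width=1.0):
    """Gaussian hidden-layer activations for each sample and center."""
    d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * width ** 2))

def fit_rbf(X, y, centers, width=1.0):
    """Output weights by linear least squares on hidden activations."""
    H = rbf_design(X, centers, width)
    w, *_ = np.linalg.lstsq(H, y, rcond=None)
    return w

# toy 1-D regression problem (illustrative data, not from the paper)
rng = np.random.default_rng(1)
X = rng.uniform(-3, 3, (200, 1))
y = np.sin(X[:, 0])

k = 8
rand_centers = X[rng.choice(len(X), k, replace=False)]  # random init
km_centers = kmeans_centers(X, k)                       # trained init

for name, C in [("random", rand_centers), ("k-means", km_centers)]:
    w = fit_rbf(X, y, C)
    mse = np.mean((rbf_design(X, C) @ w - y) ** 2)
    print(f"{name:7s} centers, train MSE = {mse:.4f}")
```

On this kind of toy problem, centers placed by quantization tend to cover the input distribution more evenly, so the same hidden-layer size typically yields a lower fitting error, consistent with the effect the abstract reports.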

Published in:

IEEE International Conference on Neural Networks, 1993

Date of Conference:

