
Diminishing the number of nodes in multi-layered neural networks

Author(s):

Nocera, P.; Lab. d'Inf., Univ. d'Avignon et des Pays de Vaucluse, Avignon, France; Quelavoine, R.

We propose in this paper two ways of diminishing the size of a multilayered neural network trained to recognise French vowels. The first deals with the hidden layers: studying the variation of the outputs of each node gives us information on its discrimination power and thus allows us to reduce the size of the network. The second involves the input nodes: by examining the connecting weights between the input nodes and the following hidden layer, we can determine which features are actually relevant for our classification problem and then eliminate the useless ones. On the problem of recognising the French vowel /a/, we show that we can obtain a reduced structure that can still learn.
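As an illustration of the two pruning criteria described in the abstract, the sketch below removes hidden units whose output variance over the training set is low and input features whose connecting weights to the hidden layer are uniformly small. This is a minimal sketch under assumed thresholds and a toy network shape, not the authors' exact procedure.

```python
# Minimal sketch of the two pruning criteria from the abstract.
# Network shape, thresholds, and data are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)

# Toy "trained" network: n_inputs -> n_hidden -> 1 output (vowel /a/ vs. not).
n_inputs, n_hidden = 16, 10
W1 = rng.normal(size=(n_inputs, n_hidden))   # input -> hidden weights
b1 = rng.normal(size=n_hidden)
X = rng.normal(size=(500, n_inputs))         # stand-in for the training set

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Criterion 1: hidden-node pruning.
# A hidden node whose output barely varies over the training set carries
# little discrimination power and can be removed.
hidden_out = sigmoid(X @ W1 + b1)            # (n_samples, n_hidden)
output_var = hidden_out.var(axis=0)
hidden_keep = output_var > 0.01              # assumed variance threshold

# Criterion 2: input-node (feature) pruning.
# An input whose connecting weights to the remaining hidden nodes are all
# small contributes little to the classification and can be eliminated.
input_strength = np.abs(W1[:, hidden_keep]).sum(axis=1)
input_keep = input_strength > 0.5 * input_strength.mean()  # assumed threshold

print(f"hidden nodes kept: {hidden_keep.sum()} / {n_hidden}")
print(f"input features kept: {input_keep.sum()} / {n_inputs}")

# The reduced network (W1[input_keep][:, hidden_keep], b1[hidden_keep])
# would then be retrained to check that it can still learn the task.
```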

Published in:

1994 IEEE International Conference on Neural Networks (IEEE World Congress on Computational Intelligence), Volume 7

Date of Conference:

27 Jun-2 Jul 1994