
Learning by parallel forward propagation

1 Author

The back-propagation algorithm is widely used for learning the weights of multilayered neural networks. Its major drawbacks, however, are slow convergence and the lack of a proper way to set the number of hidden neurons. The author proposes a learning algorithm which solves both problems. The weights between two layers are calculated successively, with the other weights held fixed, so that the error function, the squared sum of the differences between the training data and the network outputs, is minimized. Since the calculation amounts to solving a set of linearized equations, the redundancy of the hidden neurons can be judged from the singularity of the corresponding coefficient matrix. For the exclusive-OR and parity-check circuits, excellent convergence characteristics are obtained, and the redundancy of the hidden neurons is detected by the singularity of the matrix.
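The abstract only outlines the approach, so the following is a minimal illustrative sketch of the core idea rather than the paper's implementation: with the input-to-hidden weights held fixed, the hidden-to-output weights of a small network are obtained in closed form by minimizing the squared error, and near-singularity of the coefficient matrix flags redundant hidden neurons. The network shape, the tanh activation, the XOR task, and all names in the code are assumptions made for illustration.

```python
# Illustrative sketch (assumed details, not the author's exact algorithm):
# fit the hidden-to-output weights by linear least squares while the
# input-to-hidden weights stay fixed, then check the coefficient matrix
# H^T H for singularity to detect redundant hidden neurons.
import numpy as np

rng = np.random.default_rng(0)

# XOR training set: inputs X, targets T.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
T = np.array([[0], [1], [1], [0]], dtype=float)

n_hidden = 4                          # deliberately over-provisioned
W1 = rng.normal(size=(2, n_hidden))   # fixed input-to-hidden weights
b1 = rng.normal(size=n_hidden)

H = np.tanh(X @ W1 + b1)              # hidden-layer activations

# Minimize ||H W2 - T||^2, i.e. solve the linearized normal
# equations (H^T H) W2 = H^T T for the hidden-to-output weights.
A = H.T @ H
W2, *_ = np.linalg.lstsq(H, T, rcond=None)

# Redundancy check: a rank-deficient (near-singular) A = H^T H means
# some hidden neurons contribute no independent information.
rank = np.linalg.matrix_rank(A)
print(f"coefficient matrix rank {rank} of {n_hidden}; "
      f"{n_hidden - rank} redundant hidden neuron(s)")
print("network outputs:", (H @ W2).ravel())
```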

Published in:

1990 IJCNN International Joint Conference on Neural Networks

Date of Conference:

17-21 June 1990