Learning algorithm for neural networks by solving nonlinear equations

Authors: Aoki, K.; Kanezashi, M.; Maeda, C. (Dept. of Manage. Inf., Hiroshima Prefectural Univ., Japan)

The BP (backpropagation) process is a popular learning algorithm for neural networks. Despite many successful applications, the BP process has some known drawbacks, which stem from the fact that it is a gradient-based optimization procedure without a line search. In this paper, a new learning algorithm is presented, based on a solution method for nonlinear equations. Compared with the former optimization procedure, the proposed method often converges faster to the desired results. Newton's method is applied to solve the nonlinear equations. However, the major difficulty with Newton's method is that its convergence depends on the initial point. To ensure global convergence independent of the initial point, the homotopy continuation method is employed.

Published in:

Proceedings of the Second International Forum on Applications of Neural Networks to Power Systems (ANNPS '93), 1993

Date of Conference: