Should backpropagation be replaced by more effective optimization algorithms?

Authors:

J. T. Hsiung (Dept. of Chem. Eng., Texas Univ., Austin, TX, USA); W. Suewatanakul; D. M. Himmelblau

Abstract:

The authors question the use of backpropagation (BP) as the preferred technique for optimizing the values of the weights in an artificial neural network. They compare functional representation via BP with a successive quadratic programming (SQP) code, the latter being at least four times faster in reaching the same error tolerance. The proposed strategy has two main features. First, rather than adjusting the weights sequentially from the output layer back to the input layer, it adjusts the entire set of weights at once. Second, it passes the entire set of patterns through the network in each stage of iteration and uses the sum of the squares of the errors over all the patterns as the objective function. The strategy also uses a nonlinear optimization code that accommodates constraints, such as the generalized reduced gradient method or successive quadratic programming, to adjust all the weights and other parameters.
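A minimal sketch of the idea the abstract describes, not the authors' code: all weights are flattened into one vector and the full-batch sum of squared errors is minimized with an SQP routine (here SciPy's SLSQP), rather than by layer-by-layer backpropagation updates. The network size, XOR data, weight bounds, and tolerances are illustrative assumptions.

import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)

# Toy patterns (assumed for illustration): XOR, 4 patterns, 2 inputs, 1 output.
X = np.array([[0., 0.], [0., 1.], [1., 0.], [1., 1.]])
T = np.array([[0.], [1.], [1.], [0.]])

n_in, n_hid, n_out = 2, 3, 1
shapes = [(n_in, n_hid), (n_hid,), (n_hid, n_out), (n_out,)]  # W1, b1, W2, b2

def unpack(w):
    # Split the flat parameter vector back into W1, b1, W2, b2.
    parts, i = [], 0
    for s in shapes:
        n = int(np.prod(s))
        parts.append(w[i:i + n].reshape(s))
        i += n
    return parts

def sse(w):
    # Objective: sum of squared errors over ALL patterns at once,
    # the full-batch criterion the abstract describes.
    W1, b1, W2, b2 = unpack(w)
    H = np.tanh(X @ W1 + b1)   # hidden layer
    Y = np.tanh(H @ W2 + b2)   # output layer
    return float(np.sum((Y - T) ** 2))

w0 = rng.normal(scale=0.5, size=sum(int(np.prod(s)) for s in shapes))

# SLSQP is a successive-quadratic-programming method; the weight bounds
# stand in for the constraints the abstract mentions (my assumption).
res = minimize(sse, w0, method="SLSQP",
               bounds=[(-10.0, 10.0)] * w0.size,
               options={"ftol": 1e-10, "maxiter": 500})

print("final sum-of-squares error:", res.fun)

With only four patterns and a handful of weights this problem is trivially small; the abstract's four-fold speedup claim refers to the authors' own test problems, not to this toy example.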

Published in:

IJCNN-91-Seattle International Joint Conference on Neural Networks, 1991 (Vol. I)

Date of Conference:

8-14 Jul 1991