
Second order back-propagation learning algorithm and its application for neural network

5 Author(s)
Liu Tienan (Dept. of Autom. & Control Eng., Daqing Pet. Inst., Heilongjiang, China); Ren Weijian; Chen Guangyi; Xu Baochang

In this paper, a new second-order recursive learning algorithm for multilayer feedforward networks is proposed. The algorithm backpropagates not only each layer's errors but also second-order derivative information. It is proved to be equivalent to the Newton iterative algorithm and thus to have second-order convergence speed. The new algorithm computes the Newton search directions and the inverses of the Hessian matrices recursively, so its computational complexity is comparable to that of the common recursive least squares algorithm. An analysis of their properties shows that the new algorithm is superior to Karayiannis' second-order algorithm (1994).
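The abstract does not give the recursion itself, but the key claim (maintaining Newton search directions and an inverse Hessian recursively at recursive-least-squares cost) can be illustrated for the simplest case of a single linear unit with squared error, where the Gauss-Newton Hessian inverse admits a Sherman-Morrison (RLS-style) update. The sketch below is an assumption-laden illustration of that general idea, not the authors' algorithm; the function name rls_newton_step, the forgetting factor lam, and the toy data are all hypothetical.

# Illustrative sketch only (not the paper's method): RLS-style recursive
# maintenance of an inverse-Hessian estimate P and Newton-like weight update
# for a single linear unit with squared-error loss.
import numpy as np

def rls_newton_step(w, P, x, y, lam=1.0):
    """One recursive Newton-style update.

    w   : (d,)  current weight vector
    P   : (d,d) current inverse-Hessian estimate
    x   : (d,)  input sample
    y   : float target
    lam : forgetting factor (1.0 = no forgetting)
    """
    err = y - w @ x                      # prediction error on this sample
    Px = P @ x
    k = Px / (lam + x @ Px)              # gain vector (scaled Newton direction)
    w = w + k * err                      # Newton-like weight update
    P = (P - np.outer(k, Px)) / lam      # Sherman-Morrison update of P
    return w, P

# Usage: recover y = 2*x1 - x2 from streaming samples.
rng = np.random.default_rng(0)
d = 2
w = np.zeros(d)
P = 1e3 * np.eye(d)                      # large initial P acts as a weak prior
for _ in range(200):
    x = rng.normal(size=d)
    y = 2.0 * x[0] - 1.0 * x[1]
    w, P = rls_newton_step(w, P, x, y)
print(np.round(w, 3))                    # approximately [ 2. -1.]

Per-step cost is O(d^2), the same order as standard recursive least squares, which is the complexity comparison the abstract makes; extending such a recursion layer by layer through a multilayer network is the contribution described in the paper.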

Published in:

Proceedings of the 3rd World Congress on Intelligent Control and Automation, 2000 (Volume 2)

Date of Conference:

2000