
A New Fast Learning Algorithm for Multi-Layer Feedforward Neural Networks


4 Author(s)
De-xian Zhang ; Sch. of Inf. Sci. & Eng., Henan Univ. of Technol., Zhengzhou ; Can Liu ; Zi-Qiang Wang ; Nan-Bo Liu

The strongly nonlinear relation between a training sample's impact on the errors and the errors' derivatives is the fundamental reason for the low learning efficiency of multi-layer feedforward neural networks. Effectively reducing the degree of this nonlinear relation and its impact on network learning is critical to improving training efficiency. Based on this idea, this paper proposes a new approach to accelerating learning, comprising a linearization technique for the nonlinear relation, a convergence technique based on local equalization of the training samples' errors, and rotation adjustment of the weights. A new fast learning algorithm for multi-layer feedforward neural networks is also presented. The experimental results show that, compared with conventional algorithms, the new algorithm shortens training time by a factor of hundreds and remarkably improves the generalization of the neural networks.
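The abstract does not specify the linearization, local error-equalization, or rotation-adjustment techniques in enough detail to implement them, so as context, here is a minimal sketch of the conventional baseline the authors compare against: a two-layer feedforward network trained with plain gradient-descent backpropagation on XOR. All names, layer sizes, and hyperparameters here are illustrative assumptions, not the paper's method.

```python
import numpy as np

# Conventional baseline (not the paper's algorithm): a two-layer
# feedforward network with sigmoid units, trained by standard
# backpropagation of the mean-squared error on the XOR problem.

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# XOR training set: four samples, two inputs, one target output.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

W1 = rng.normal(size=(2, 4))  # input -> hidden weights (4 hidden units)
W2 = rng.normal(size=(4, 1))  # hidden -> output weights
lr = 1.0                      # learning rate (illustrative choice)

losses = []
for _ in range(2000):
    h = sigmoid(X @ W1)        # hidden activations
    out = sigmoid(h @ W2)      # network output
    err = out - y
    losses.append(float(np.mean(err ** 2)))
    # Backpropagate the squared-error gradient through both layers.
    d_out = err * out * (1 - out)
    d_h = (d_out @ W2.T) * h * (1 - h)
    W2 -= lr * h.T @ d_out
    W1 -= lr * X.T @ d_h
```

The paper's contribution is to replace this purely gradient-driven update with steps that flatten the error/derivative nonlinearity and equalize errors locally across samples, which is where the claimed speedup over this baseline comes from.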

Published in:

2006 International Conference on Machine Learning and Cybernetics

Date of Conference:

13-16 Aug. 2006