Magnified gradient function in adaptive learning: the MGFPROP algorithm

3 Author(s)
Sin-Chun Ng (Dept. of Comput. & Math., Inst. of Vocational Educ., Hong Kong, China); Chi-Chung Cheung; Shu-Hung Leung

A new algorithm is proposed to solve the "flat spot" problem in backpropagation neural networks by magnifying the gradient function. Simulation results show that the new algorithm consistently outperforms traditional methods in both convergence rate and percentage of global convergence.
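The "flat spot" arises because the logistic sigmoid's derivative, f'(net) = o(1 - o), collapses toward zero when a unit's output o saturates near 0 or 1, so a badly wrong unit receives almost no weight update. The sketch below illustrates one way to magnify the gradient in the spirit of the abstract: raising the derivative factor to a power 1/S with S >= 1. This is a hedged illustration based only on the abstract; the exact magnification used by MGFPROP is defined in the full paper, and the names `plain_delta`, `magnified_delta`, and the choice S = 2 here are illustrative assumptions.

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def plain_delta(target, output):
    # Standard backprop output-layer error term:
    # delta = (t - o) * f'(net), with f'(net) = o * (1 - o)
    # for the logistic sigmoid.
    return (target - output) * output * (1.0 - output)

def magnified_delta(target, output, S=2.0):
    # Illustrative magnified variant (an assumption, not the exact
    # MGFPROP rule): raise the derivative factor to the power 1/S,
    # S >= 1. This preserves its sign but boosts its magnitude in
    # the saturation regions o -> 0 or o -> 1 where o*(1-o) vanishes.
    return (target - output) * (output * (1.0 - output)) ** (1.0 / S)

# A saturated, badly wrong unit: target is 1 but the output is near 0.
o = sigmoid(-6.0)                      # heavily saturated output
print(plain_delta(1.0, o))             # tiny update: the flat spot
print(magnified_delta(1.0, o, S=2.0))  # much larger update
```

With S = 1 the magnified term reduces to standard backpropagation, so S acts as a knob that trades the exact gradient for a stronger error signal in the saturated regime.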

Published in:

Electronics Letters (Volume: 37, Issue: 1)