A fast hybrid algorithm of global optimization for feedforward neural networks

7 Author(s)
Jiang Minghu (Inst. of Inf. Sci., Northern Jiaotong Univ., Beijing, China); Zhu Xiaoyan; Yuan Baozong; Tang Xiaofang; et al.

This paper presents a hybrid algorithm with a dynamically adapted learning rate for the global optimization of multilayer feedforward neural networks (MLFNN). The effect of an inexact line search on conjugacy is studied, and a generalized conjugate gradient method based on this effect is proposed and shown to be globally convergent for error backpropagation in MLFNN. The descent property and global convergence of the improved hybrid conjugate gradient algorithm are established. Experimental results show a considerable improvement over the Fletcher-Reeves algorithm and the conventional backpropagation (BP) algorithm, and the method overcomes the tendency of conventional BP and the Polak-Ribière conjugate gradient algorithm to become trapped in local minima.
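The abstract does not give the authors' algorithm in detail, but the ingredients it names — conjugate gradient directions, an inexact line search, and a safeguard against losing the descent property — can be illustrated with a minimal sketch. The code below is an assumption-laden reconstruction, not the paper's method: it trains a small one-hidden-layer network on XOR using Polak-Ribière conjugate gradient with an Armijo (inexact) backtracking line search, restarting to steepest descent whenever the conjugate direction fails to be a descent direction. The network size, task, and all hyperparameters are illustrative choices.

```python
import numpy as np

# Illustrative sketch (not the authors' algorithm): conjugate-gradient
# training of a 2-3-1 tanh network on XOR, with an inexact (Armijo)
# line search and a steepest-descent restart as a hybrid safeguard.

rng = np.random.default_rng(0)
X = np.array([[0., 0.], [0., 1.], [1., 0.], [1., 1.]])
T = np.array([[0.], [1.], [1.], [0.]])

def unpack(w):
    W1 = w[:6].reshape(2, 3); b1 = w[6:9]
    W2 = w[9:12].reshape(3, 1); b2 = w[12:]
    return W1, b1, W2, b2

def loss_and_grad(w):
    W1, b1, W2, b2 = unpack(w)
    H = np.tanh(X @ W1 + b1)           # hidden activations
    Y = H @ W2 + b2                    # linear output
    E = Y - T
    loss = 0.5 * np.mean(E ** 2)
    dY = E / len(X)                    # dLoss/dY
    gW2 = H.T @ dY; gb2 = dY.sum(0)
    dH = (dY @ W2.T) * (1 - H ** 2)    # backprop through tanh
    gW1 = X.T @ dH; gb1 = dH.sum(0)
    return loss, np.concatenate([gW1.ravel(), gb1, gW2.ravel(), gb2])

def armijo_step(w, d, g, loss, eta=1.0, c=1e-4):
    # Inexact line search: halve eta until the Armijo condition holds.
    while True:
        new_loss, _ = loss_and_grad(w + eta * d)
        if new_loss <= loss + c * eta * (g @ d) or eta < 1e-10:
            return eta
        eta *= 0.5

w = rng.normal(scale=0.5, size=13)
loss, g = loss_and_grad(w)
init_loss = loss
d = -g                                  # start with steepest descent
for it in range(500):
    eta = armijo_step(w, d, g, loss)
    w = w + eta * d
    loss, g_new = loss_and_grad(w)
    beta = max(0.0, g_new @ (g_new - g) / (g @ g))  # Polak-Ribiere+ beta
    d = -g_new + beta * d
    if g_new @ d >= 0:                  # hybrid safeguard: restart with
        d = -g_new                      # steepest descent if not descent
    g = g_new

print(f"initial loss: {init_loss:.6f}, final loss: {loss:.6f}")
```

Because the Armijo condition only accepts steps that decrease the loss, the final loss is guaranteed not to exceed the initial one; the restart rule is one simple way a conjugate gradient method can be "hybridized" with steepest descent to preserve the descent property under an inexact line search.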

Published in:

5th International Conference on Signal Processing Proceedings (WCCC-ICSP 2000), Volume 3

Date of Conference: