
Adaptive Nonmonotone Conjugate Gradient Training Algorithm for Recurrent Neural Networks


Authors: Chun-Cheng Peng (Univ. of London, London); George D. Magoulas

Recurrent networks constitute an elegant way of increasing the capacity of feedforward networks to deal with complex data in the form of sequences of vectors. They are well known for their power to model temporal dependencies and process sequences for classification, recognition, and transduction. In this paper, we propose a nonmonotone conjugate gradient training algorithm for recurrent neural networks, which is equipped with an adaptive tuning strategy for the nonmonotone learning horizon. Simulation results show that this modification of conjugate gradient is more effective than the original CG in four applications using three different recurrent network architectures.
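The abstract's core idea, accepting a conjugate gradient step when the new loss is no worse than the maximum over a sliding window of recent loss values, can be sketched as follows. This is only a minimal Python illustration under stated assumptions: it uses a standard Grippo-style nonmonotone Armijo test with a Polak-Ribiere direction and treats the learning horizon as a fixed window, rather than reproducing the authors' adaptive tuning strategy; the function nonmonotone_cg and the toy quadratic loss are hypothetical stand-ins, not the paper's implementation.

import numpy as np

def nonmonotone_cg(f, grad, x0, horizon=10, delta=1e-4, max_iter=200, tol=1e-6):
    # Conjugate gradient minimization with a nonmonotone (Grippo-style)
    # Armijo acceptance test over the last `horizon` function values.
    # The adaptive horizon tuning proposed in the paper is NOT reproduced here;
    # `horizon` is a fixed window size used only for illustration.
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g                      # initial direction: steepest descent
    history = [f(x)]            # recent loss values for the nonmonotone test
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        ref = max(history[-horizon:])       # nonmonotone reference value
        alpha, slope = 1.0, float(g @ d)
        # backtrack until the nonmonotone Armijo condition holds
        while f(x + alpha * d) > ref + delta * alpha * slope and alpha > 1e-12:
            alpha *= 0.5
        x_new = x + alpha * d
        g_new = grad(x_new)
        beta = float(g_new @ (g_new - g)) / float(g @ g)   # Polak-Ribiere coefficient
        d = -g_new + max(beta, 0.0) * d                    # PR+ restart safeguard
        x, g = x_new, g_new
        history.append(f(x))
    return x

# usage on a toy quadratic (stand-in for a recurrent network's training loss)
if __name__ == "__main__":
    A = np.diag([1.0, 10.0, 100.0])
    f = lambda w: 0.5 * w @ A @ w
    grad = lambda w: A @ w
    print(nonmonotone_cg(f, grad, np.array([1.0, 1.0, 1.0])))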

Published in:

19th IEEE International Conference on Tools with Artificial Intelligence (ICTAI 2007), Volume 2

Date of Conference:

29-31 Oct. 2007