An accelerated recurrent network training algorithm using IIR filter model and recursive least squares method

2 Author(s)
T. W. S. Chow (Dept. of Electron. Eng., City Univ. of Hong Kong, Kowloon, Hong Kong); Siu-Yeung Cho

A new training algorithm for a fully connected recurrent neural network, based on digital filter theory, is proposed. Each recurrent neuron is modeled by an infinite impulse response (IIR) filter. The weights of each layer in the network are updated by optimizing the IIR filter coefficients, and the optimization is based on the recursive least squares (RLS) method. Our results indicate that the proposed algorithm provides an extremely fast convergence rate. In this letter, the algorithm is validated by applying it to the sunspots time series, the Mackey-Glass time series, and nonlinear function approximation problems. The convergence speed of the RLS-based algorithm is compared with that of other fast algorithms. The obtained results show that the proposed algorithm can be up to 200 times faster than the conventional backpropagation algorithm.
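The core idea, treating each recurrent neuron as an IIR filter whose coefficients are fitted by recursive least squares, can be illustrated with standard exponentially weighted RLS applied to IIR system identification. This is a minimal sketch of the general technique, not the paper's exact per-layer formulation; the function name, regressor layout, and parameters (`lam`, `delta`) are illustrative assumptions.

```python
import numpy as np

def rls_iir_identify(x, d, n_b=2, n_a=1, lam=0.99, delta=100.0):
    """Fit IIR filter coefficients with recursive least squares.

    The regressor at each step stacks current/past inputs (feedforward
    b coefficients) and past desired outputs (feedback a coefficients),
    mirroring an IIR-filter neuron model. Hypothetical sketch only.
    """
    n_w = n_b + n_a
    w = np.zeros(n_w)                 # [b0, ..., b_{nb-1}, a1, ..., a_{na}]
    P = delta * np.eye(n_w)           # inverse correlation matrix estimate
    for n in range(max(n_b, n_a + 1) - 1, len(x)):
        phi = np.concatenate([
            x[n - np.arange(n_b)],      # x[n], x[n-1], ...
            d[n - 1 - np.arange(n_a)],  # d[n-1], ... (teacher forcing)
        ])
        k = P @ phi / (lam + phi @ P @ phi)   # RLS gain vector
        e = d[n] - w @ phi                    # a priori output error
        w = w + k * e                         # coefficient update
        P = (P - np.outer(k, phi @ P)) / lam  # covariance update
    return w

# Example: identify y[n] = 0.5 x[n] - 0.3 x[n-1] + 0.8 y[n-1]
rng = np.random.default_rng(0)
x = rng.standard_normal(500)
d = np.zeros(500)
for n in range(1, len(x)):
    d[n] = 0.5 * x[n] - 0.3 * x[n - 1] + 0.8 * d[n - 1]
w_hat = rls_iir_identify(x, d)  # approaches [0.5, -0.3, 0.8]
```

Because RLS carries second-order information in `P`, the coefficients converge in far fewer samples than a gradient step of comparable cost, which is the source of the speedup the abstract reports over backpropagation.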

Published in:

IEEE Transactions on Circuits and Systems I: Fundamental Theory and Applications (Volume: 44, Issue: 11)