Temporal backpropagation for FIR neural networks

1 Author(s)

The traditional feedforward neural network is a static structure that simply maps input to output. To better reflect the dynamics of a biological system, a network structure is proposed in which each synapse is modeled by a finite-impulse response (FIR) linear filter. An efficient gradient-descent algorithm, shown to be a temporal generalization of the familiar backpropagation algorithm, is derived. By modeling each synapse as a linear filter, the neural network as a whole may be thought of as an adaptive system with its own internal dynamics. Equivalently, one may think of the network as a complex nonlinear filter. Applications should thus include areas of pattern recognition where the data have an inherent temporal quality, such as speech recognition. The networks should also find natural use in nonlinear control and in other adaptive signal processing and filtering applications such as noise cancellation or equalization.
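The abstract itself contains no code; as a rough sketch of the FIR-synapse idea, each connection replaces the usual scalar weight with a tap-delay line, so a neuron's activation depends on the current and past inputs. The function name, array shapes, and NumPy usage below are illustrative assumptions, not the paper's notation:

```python
import numpy as np

def fir_layer_forward(x_history, W, activation=np.tanh):
    """Forward pass through one layer whose synapses are FIR filters.

    x_history : shape (T+1, n_in) -- the current input x(t) in row 0,
                followed by the T previous inputs x(t-1), ..., x(t-T).
    W         : shape (n_out, n_in, T+1) -- tap weights; W[j, i, k]
                weights input i delayed by k steps into output neuron j.
    Returns the layer output, shape (n_out,).
    """
    # Each synapse computes sum_k W[j, i, k] * x_i(t - k);
    # the neuron sums over all incoming synapses i, then applies
    # the nonlinearity -- with T = 0 this reduces to a static layer.
    s = np.einsum('jik,ki->j', W, x_history)
    return activation(s)

# Example: 2 inputs, 1 output neuron, filter order T = 2
rng = np.random.default_rng(0)
W = rng.normal(size=(1, 2, 3))
x_hist = rng.normal(size=(3, 2))   # rows: x(t), x(t-1), x(t-2)
y = fir_layer_forward(x_hist, W)
```

Stacking such layers and unfolding the tap delays in time is what turns ordinary backpropagation into the temporal backpropagation the paper derives.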

Published in:

1990 IJCNN International Joint Conference on Neural Networks

Date of Conference:

17-21 June 1990