
Stochastic convergence analysis of a two-layer backpropagation algorithm for a nonlinear system identification model


Authors:

N. J. Bershad (Dept. of Electr. & Comput. Eng., California Univ., Irvine, CA, USA); J. J. Shynk

Abstract:

The stationary points of a two-layer perceptron that attempts to identify the parameters of a specific nonlinear system are studied. The training sequence is modeled as the binary output of the nonlinear system when the input is an independent sequence of zero-mean Gaussian vectors with independent components. The training rule backpropagates the error at the input to the outer-layer nonlinearity rather than the error at the output of that nonlinearity. Coupled nonlinear equations are derived for the hidden- and output-layer weights; these equations define a multiplicity of stationary points. One solution to these equations indicates that the hidden-layer weights match those of the nonlinear system and that the outer-layer weights minimize the mean-square error (MSE). The second-layer output can be made to match the training sequence by an appropriate choice of bias. Hence, the two-layer perceptron correctly identifies the parameters of the unknown system even though the training rule does not propagate the output error.
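To make the training rule concrete, the following is a minimal sketch (not the paper's exact formulation) of the idea described above: the error is formed at the *input* to the outer-layer nonlinearity, so the hard limiter's derivative never enters the gradient. The unknown system here is assumed, purely for illustration, to be a single-neuron sign nonlinearity; the hidden-unit count, step size, and tanh activation are likewise illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical unknown system: binary output of a hard limiter (assumed for illustration)
n = 4
w_true = rng.standard_normal(n)

def system(x):
    return np.sign(w_true @ x)  # binary training target

def sigmoid(u):
    return np.tanh(u)  # hidden-layer nonlinearity (illustrative choice)

# Two-layer perceptron: sigmoidal hidden layer, linear combiner, hard limiter at the output
m = 2                                   # hidden units (assumed)
W = 0.1 * rng.standard_normal((m, n))   # hidden-layer weights
c = 0.1 * rng.standard_normal(m)        # output-layer weights
b = 0.0                                 # output bias
mu = 0.05                               # step size (assumed)

for _ in range(20000):
    x = rng.standard_normal(n)          # zero-mean Gaussian input, independent components
    d = system(x)                       # binary training sample
    h = sigmoid(W @ x)                  # hidden-layer outputs
    s = c @ h + b                       # input to the outer nonlinearity
    # Modified rule: the error is taken at the INPUT to the outer
    # nonlinearity (d - s), not at its output (d - sign(s)).
    e = d - s
    c += mu * e * h
    b += mu * e
    W += mu * e * np.outer(c * (1 - h**2), x)   # tanh'(u) = 1 - tanh(u)**2
```

Because `e = d - s` is a linear (LMS-style) error in the outer-layer weights, the `c` and `b` updates need no derivative of the output hard limiter, which is the distinguishing feature of the rule the abstract analyzes.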

Published in:

Proceedings of the 1992 IEEE International Symposium on Circuits and Systems (ISCAS '92), Volume 1

Date of Conference:

10-13 May 1992