
Statistical analysis of a two-layer backpropagation algorithm used for modeling nonlinear memoryless channels: the single neuron case

Authors: N.J. Bershad (Dept. of Electr. & Comput. Eng., California Univ., Irvine, CA, USA); M. Ibnkahla; F. Castanie

Neural networks have been used for modeling the nonlinear characteristics of memoryless channels via backpropagation (BP) learning with experimental training data. In order to better understand this neural network application, this paper studies the transient and convergence properties of a simplified two-layer neural network that uses the BP algorithm and is trained with zero-mean Gaussian data. The paper studies the effects of the neural net structure, weights, initial conditions, and algorithm step size on the mean square error (MSE) of the neural net approximation. The performance analysis is based on the derivation of recursions for the mean weight update, which can be used to predict the weights and the MSE over time. Monte Carlo simulations display good to excellent agreement between the actual behavior and the predictions of the theoretical model.
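The setting described above can be illustrated with a minimal simulation sketch. The channel nonlinearity, step size, and initial weights below are illustrative assumptions, not the paper's actual values: a single-neuron two-layer model y = c·tanh(w·x + b) is trained by stochastic backpropagation on zero-mean Gaussian inputs to fit a hypothetical memoryless nonlinearity, and the squared error is tracked so its decay over time can be observed.

```python
import numpy as np

rng = np.random.default_rng(0)

def channel(x):
    # Hypothetical memoryless nonlinear channel (a saturating response);
    # the paper's actual channel model may differ.
    return np.tanh(2.0 * x)

def train_bp(n_iter=20000, mu=0.05):
    """Stochastic BP training of y = c*tanh(w*x + b) on Gaussian data."""
    w, b, c = 0.1, 0.0, 0.1          # small illustrative initial weights
    mse_trace = []
    for _ in range(n_iter):
        x = rng.normal(0.0, 1.0)     # zero-mean Gaussian training datum
        d = channel(x)               # desired (channel) output
        s = w * x + b
        t = np.tanh(s)
        y = c * t                    # network output
        e = d - y                    # instantaneous error
        # Gradient-descent (BP) updates for the squared error e^2/2:
        g = c * (1.0 - t * t)        # derivative through the hidden tanh
        c += mu * e * t
        w += mu * e * g * x
        b += mu * e * g
        mse_trace.append(e * e)
    return w, b, c, np.array(mse_trace)

w, b, c, mse = train_bp()
early_mse = mse[:100].mean()
final_mse = mse[-1000:].mean()
print(f"early MSE ~ {early_mse:.4f}, final MSE ~ {final_mse:.4f}")
```

Averaging the squared error over a window, as done for `final_mse`, is a rough stand-in for the ensemble-averaged MSE that the paper's mean-weight recursions predict analytically.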

Published in:

IEEE Transactions on Signal Processing (Volume: 45, Issue: 3)