Stochastic convergence analysis of the single-layer backpropagation algorithm for noisy input data

Authors: Bershad, N.J.; Cubaud, N.; Shynk, J.J. (Dept. of Electr. & Comput. Eng., California Univ., Irvine, CA, USA)

The statistical learning behavior of the single-layer backpropagation algorithm was previously analyzed in a system identification formulation for noise-free training data, where transient and steady-state results were obtained for the mean weight behavior, the mean-square error (MSE), and the probability of correct classification. This article extends those results to the case of noisy training data. Three new analytical results are obtained: (1) the mean weights converge to finite values, (2) the MSE is bounded away from zero, and (3) the probability of correct classification does not converge to unity. However, over a wide range of signal-to-noise ratios (SNRs), the noisy training data does not have a significant effect on the perceptron stationary points relative to the weight fluctuations. Hence, one concludes that noisy training data has a relatively small effect on the ability of the perceptron to learn the underlying weight vector F of the training signal model.
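The three analytical results can be illustrated with a small Monte Carlo simulation. The sketch below is not the paper's analysis; it trains a single-layer perceptron (tanh nonlinearity, LMS-style backpropagation update) on labels generated by a hypothetical weight vector F, with additive Gaussian noise on the inputs. The specific values of F, the step size, and the SNR are illustrative assumptions, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical underlying weight vector F of the training-signal model.
F = np.array([1.0, -0.5, 0.25])
mu = 0.01                      # step size (illustrative choice)
snr = 10.0                     # linear input SNR (illustrative choice)
noise_std = np.sqrt(1.0 / snr)
n_steps = 20000

w = np.zeros(3)                # perceptron weights
sq_err = []

for _ in range(n_steps):
    x = rng.standard_normal(3)          # clean input vector
    d = np.sign(F @ x)                  # desired label from the signal model
    x_noisy = x + noise_std * rng.standard_normal(3)  # noisy training input
    y = np.tanh(x_noisy @ w)            # single-layer perceptron output
    e = d - y
    # Backpropagation (stochastic gradient) update for the tanh unit.
    w += mu * e * (1.0 - y**2) * x_noisy
    sq_err.append(e**2)

# Steady-state behavior: finite mean weights, MSE bounded away from zero.
steady_mse = float(np.mean(sq_err[-2000:]))
```

Running this, the weights settle to finite values roughly aligned with the direction of F, while `steady_mse` stays strictly positive, consistent with results (1) and (2) of the abstract.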

Published in:

IEEE Transactions on Signal Processing (Volume 44, Issue 5)