Statistical analysis of the single-layer backpropagation algorithm for noisy training data

Authors:
Bershad, N.J. (Dept. of Electr. & Comput. Eng., California Univ., Irvine, CA, USA); Cubaud, N.; Shynk, J.J.

The statistical learning behavior of the single-layer backpropagation algorithm was previously analyzed using a system identification formulation for noise-free training data [Bershad et al. 1993]. Transient and steady-state results were obtained for the mean weight behavior, mean-square error (MSE), and probability of correct classification. The present paper extends these results to the case of noisy training data. Three new analytical results are obtained: 1) the mean weights converge to finite values even when the bias terms are zero, 2) the MSE is bounded away from zero, and 3) the probability of correct classification does not converge to unity. However, over a wide range of signal-to-noise ratios (SNRs), the noisy training data does not have a significant effect on the perceptron stationary points relative to the weight fluctuations. Hence, one concludes that noisy training data has a relatively small effect on the ability of the perceptron to learn the model weight vector F.
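The qualitative behavior described above can be illustrated numerically. The following is a minimal simulation sketch, not the paper's exact formulation: a single-layer perceptron (one tanh neuron, zero bias) is trained by stochastic gradient backpropagation to identify a model weight vector F, with additive Gaussian noise on the desired output. The dimension, step size, and noise level are illustrative assumptions. With noisy training data, the steady-state MSE plateaus near the noise power rather than converging to zero, while the weights still align with F.

```python
import numpy as np

rng = np.random.default_rng(0)

N = 8                       # input dimension (illustrative assumption)
F = rng.standard_normal(N)  # unknown "model" weight vector to identify
F /= np.linalg.norm(F)

mu = 0.05                   # step size (illustrative assumption)
noise_std = 0.3             # training-noise level; controls the SNR
w = np.zeros(N)             # adaptive perceptron weights, zero bias

mse = []
for k in range(20000):
    x = rng.standard_normal(N)
    # Desired response from the model perceptron, corrupted by noise
    d = np.tanh(F @ x) + noise_std * rng.standard_normal()
    y = np.tanh(w @ x)
    e = d - y
    # Backprop update for a single tanh neuron:
    # dE/dw = -e * (1 - y^2) * x, so gradient descent adds mu*e*(1-y^2)*x
    w += mu * e * (1.0 - y**2) * x
    mse.append(e**2)

steady_mse = np.mean(mse[-2000:])
print(f"steady-state MSE ~ {steady_mse:.3f}, noise power = {noise_std**2:.3f}")
print(f"cosine(w, F) = {np.dot(w / np.linalg.norm(w), F):.3f}")
```

Consistent with the paper's conclusions, the printed steady-state MSE stays bounded away from zero (at or above the training-noise power), yet the learned weight direction remains close to F, since the noise perturbs the stationary points only mildly relative to the weight fluctuations.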

Published in:

1995 International Conference on Acoustics, Speech, and Signal Processing (ICASSP-95), Volume 5

Date of Conference:

9-12 May 1995