Convergence models for Rosenblatt's perceptron learning algorithm

S. N. Diggavi (Dept. of Electr. & Comput. Eng., University of California, Santa Barbara, CA, USA); J. J. Shynk; N. J. Bershad

Presents a stochastic analysis of the steady-state and transient convergence properties of a single-layer perceptron for fast learning (large step-size, input-power product). The training data are modeled using a system identification formulation with zero-mean Gaussian inputs. The perceptron weights are adjusted by a learning algorithm equivalent to Rosenblatt's perceptron convergence procedure. It is shown that the convergence points of the algorithm depend on the step size μ and the input signal power (variance) σ_x^2, and that the algorithm is stable essentially for μ > 0. Two coupled nonlinear recursions are derived that accurately model the transient behavior of the algorithm. The authors also examine how these convergence results are affected by noisy perceptron input vectors. Computer simulations are presented to verify the analytical models.
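The setting described in the abstract can be illustrated with a minimal sketch: a fixed "reference" perceptron generates the training labels (the system identification formulation), and the adaptive weights are updated with a Rosenblatt-style rule. The dimension, step size μ, and input variance σ_x^2 below are illustrative assumptions, not values from the paper, and the update shown is the textbook perceptron rule rather than the authors' exact analysis model.

```python
import numpy as np

rng = np.random.default_rng(0)

n = 8            # input dimension (assumed)
mu = 0.1         # step size mu (assumed value)
sigma_x = 1.0    # input standard deviation, so sigma_x^2 = 1 (assumed)

w_opt = rng.standard_normal(n)   # unknown reference weights to identify
w = np.zeros(n)                  # adaptive perceptron weights

for _ in range(5000):
    x = sigma_x * rng.standard_normal(n)   # zero-mean Gaussian input
    d = np.sign(w_opt @ x)                 # desired output from reference system
    y = np.sign(w @ x)                     # adaptive perceptron output
    w += mu * (d - y) * x                  # Rosenblatt-style weight update

# After training, w should point roughly along w_opt (up to scale),
# since the sign nonlinearity only constrains the weight direction.
cos_sim = (w @ w_opt) / (np.linalg.norm(w) * np.linalg.norm(w_opt))
print(cos_sim)
```

Note that the sign activation makes only the direction of w identifiable; its norm depends on μ and σ_x^2, which is consistent with the abstract's observation that the convergence points depend on both quantities.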

Published in:

IEEE Transactions on Signal Processing (Volume: 43, Issue: 7)