Structural stability of unsupervised learning in feedback neural networks

Author: B. A. Kosko, Dept. of Electrical Engineering, University of Southern California, Los Angeles, CA, USA

Structural stability is proved for a large class of unsupervised nonlinear feedback neural networks, the adaptive bidirectional associative memory (ABAM) models. The approach extends the ABAM models to the random-process domain as systems of stochastic differential equations by appending scaled Brownian diffusions. It is also proved that this much larger family of models, the random ABAM (RABAM) models, is globally stable. Intuitively, RABAM equilibria equal ABAM equilibria that vibrate randomly. The ABAM family includes many unsupervised feedback and feedforward neural models. All RABAM models permit Brownian annealing. The RABAM noise suppression theorem characterizes RABAM system vibration: the mean-squared activation and synaptic velocities decrease exponentially to their lower bounds, the respective temperature-scaled noise variances. The many neuronal and synaptic parameters missing from such neural network models are included, but as net random unmodeled effects. They do not affect the structure of real-time global computations.
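The abstract's picture of a RABAM, an ABAM flow with scaled Brownian diffusions appended, can be illustrated with a minimal Euler-Maruyama simulation. The sketch below is an assumption-laden toy, not the paper's construction: the dimensions, step size, noise scale `sigma`, and the choice of tanh signal functions with additive activation dynamics and signal-Hebbian synaptic learning are all illustrative choices. It shows the qualitative claim that the deterministic drift decays while the state keeps vibrating at a noise floor.

```python
import numpy as np

rng = np.random.default_rng(0)

def signal(v):
    # Bounded monotone signal function (illustrative choice)
    return np.tanh(v)

# Hypothetical sizes and constants, not from the paper
n, p = 4, 3      # neurons in fields F_X and F_Y
dt = 0.01        # Euler-Maruyama time step
sigma = 0.05     # Brownian diffusion ("temperature") scale
steps = 5000

x = rng.standard_normal(n)               # F_X activations
y = rng.standard_normal(p)               # F_Y activations
M = 0.1 * rng.standard_normal((n, p))    # synaptic matrix

for _ in range(steps):
    Sx, Sy = signal(x), signal(y)
    # Deterministic ABAM drift: additive bidirectional activation dynamics
    dx = -x + M @ Sy
    dy = -y + M.T @ Sx
    # Signal-Hebbian synaptic drift
    dM = -M + np.outer(Sx, Sy)
    # Euler-Maruyama update: drift plus scaled Brownian increments
    x += dx * dt + sigma * np.sqrt(dt) * rng.standard_normal(n)
    y += dy * dt + sigma * np.sqrt(dt) * rng.standard_normal(p)
    M += dM * dt + sigma * np.sqrt(dt) * rng.standard_normal((n, p))

# After the transient, activations and synapses hover ("vibrate")
# near an ABAM equilibrium; the drift magnitudes stay small.
print(float(np.mean(dx**2)), float(np.mean(dM**2)))
```

In this toy run the mean-squared drift terms shrink toward a small noise-driven floor, mirroring the noise suppression theorem's statement that mean-squared velocities decrease to temperature-scaled variances.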

Published in:

IEEE Transactions on Automatic Control (Volume: 36, Issue: 7)