Structure-based neural network learning

2 Author(s)
N. Peterfreund (Center for Engineering Systems Advanced Research, Oak Ridge National Laboratory, TN, USA); A. Guez

We present a new learning algorithm for the structure of recurrent neural networks. It is shown that any m linearly independent n-dimensional vectors can be stored in an at most (n+m-2)-dimensional symmetric network, and a storage procedure that satisfies this bound is presented. We propose a new learning procedure for the domain of attraction which preserves both the equilibrium set and the stability properties of the original system. It is shown that previously learned attraction regions remain invariant under the proposed learning rule. Our emphasis throughout this brief is on the design of associative memories and classifiers.
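The paper's structure-based storage procedure is not reproduced in the abstract. As background for the idea of storing vectors as stable equilibria of a symmetric network, the following is a minimal sketch of the classical outer-product (Hebbian) rule for a Hopfield-type symmetric associative memory; the function names, the ±1 pattern encoding, and the synchronous update dynamics are illustrative assumptions, not the authors' algorithm.

```python
import numpy as np

def store(patterns):
    # Outer-product (Hebbian) rule: a standard construction, not the
    # paper's storage procedure. Builds a symmetric weight matrix with
    # zero diagonal whose rows/columns span the stored +/-1 patterns.
    n = patterns.shape[1]
    W = patterns.T @ patterns / n
    np.fill_diagonal(W, 0.0)
    return W

def recall(W, x, steps=10):
    # Synchronous sign-update dynamics; for symmetric W the stored
    # patterns act as attracting fixed points when they are few and
    # nearly orthogonal.
    for _ in range(steps):
        x = np.sign(W @ x)
    return x

# Two orthogonal 8-dimensional +/-1 patterns (m = 2, n = 8).
patterns = np.array([[1, 1, 1, 1, -1, -1, -1, -1],
                     [1, -1, 1, -1, 1, -1, 1, -1]], dtype=float)
W = store(patterns)

# Corrupt the first pattern by one bit and let the dynamics repair it.
noisy = patterns[0].copy()
noisy[0] *= -1
print(np.array_equal(recall(W, noisy), patterns[0]))  # prints True
```

The symmetry of W is what guarantees gradient-like (energy-decreasing) dynamics, which is the setting in which statements about equilibrium sets and domains of attraction, as in the abstract, are made.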

Published in:

IEEE Transactions on Circuits and Systems I: Fundamental Theory and Applications (Volume: 44, Issue: 12)