
An efficient learning algorithm for associative memories

2 Author(s): Yingquan Wu (Dept. of Electr. Eng., State Univ. of New York, Buffalo, NY, USA); S. N. Batalama

Associative memories (AMs) can be implemented using networks with or without feedback. We utilize a two-layer feedforward neural network and propose a learning algorithm that efficiently implements the association rule of a bipolar AM. The hidden layer of the network employs p neurons, where p is the number of prototype patterns. In the first layer, the input pattern activates at most one hidden-layer neuron, or "winner". In the second layer, the "winner" associates the input pattern with the corresponding prototype pattern. The underlying association principle is minimum Hamming distance, and the proposed scheme can also be viewed as an approximate minimum Hamming distance decoder. Theoretical analysis supported by simulations indicates that, in comparison with other suboptimum minimum Hamming distance association schemes, the proposed structure exhibits the following favorable characteristics: 1) it operates in one shot, which implies no convergence-time requirements; 2) it does not require any feedback; and 3) our case studies show that it exhibits superior performance to the popular linear system in a saturated mode. The network also exhibits 4) exponential capacity and 5) easy performance assessment (no asymptotic analysis is necessary). Finally, since it does not require any hidden-layer interconnections or tree-search operations, it exhibits low structural as well as operational complexity.
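To make the association rule concrete, the following is a minimal sketch (in Python/NumPy, not the authors' implementation) of the kind of two-layer feedforward recall the abstract describes. For bipolar patterns, minimizing Hamming distance is equivalent to maximizing the inner product with a prototype, so the first layer scores the input against each of the p prototypes and the second layer returns the winning prototype. The argmax winner-take-all stage and all names below are illustrative assumptions; the paper's learning algorithm activates at most one winner via its own weight/threshold construction.

```python
import numpy as np

def build_am(prototypes):
    """Two-layer feedforward associative recall (illustrative sketch).

    prototypes: (p, n) array of bipolar (+1/-1) prototype patterns.
    For bipolar vectors, Hamming distance d(x, m) = (n - x.m) / 2,
    so the smallest Hamming distance corresponds to the largest
    inner product.
    """
    M = np.asarray(prototypes, dtype=int)       # first-layer weights: one row per prototype

    def recall(x):
        x = np.asarray(x, dtype=int)
        scores = M @ x                          # hidden-layer activations (inner products)
        winner = int(np.argmax(scores))         # winner-take-all stand-in for the paper's thresholding
        return M[winner].copy()                 # second layer emits the associated prototype

    return recall

# Usage: store random bipolar prototypes, corrupt one, and recall it in one shot.
rng = np.random.default_rng(0)
protos = rng.choice([-1, 1], size=(4, 64))      # p = 4 prototypes of length n = 64
recall = build_am(protos)
noisy = protos[2].copy()
noisy[:5] *= -1                                 # flip 5 bits
assert np.array_equal(recall(noisy), protos[2]) # nearest prototype recovered, no feedback iterations
```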

Published in:

IEEE Transactions on Neural Networks (Volume: 11, Issue: 5)

Date of Publication:

Sep 2000
