
Corrective memory by a symmetric sparsely encoded network

Author: Baram, Y.; Dept. of Comput. Sci., Technion-Israel Inst. of Technol., Haifa, Israel

A neural network that retrieves stored binary vectors when probed by possibly corrupted versions of them is presented. It employs sparse ternary internal coding and autocorrelation (Hebbian) storage. It is symmetrically structured and, consequently, can be folded into a feedback configuration. Bounds on the network parameters are derived from probabilistic considerations. It is shown that when the input dimension is n, the proportional activation radius is ρ, and the network size is 2^(νn) with ν > 1 - h₂(ρ), the equilibrium capacity is at least 2^(αn)/(8nρ(1-ρ)) for any α < 1 - h₂(ρ), where h₂(·) is the binary entropy function. A similar capacity bound is derived for the correction of errors of proportional size ρ or less, when ρ ≤ 0.3. The performance of a finite-size symmetric network is examined by simulation and found to exceed, at the cost of higher connectivity, that of the Kanerva (1988) model operating as a content-addressable memory.
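The abstract describes the mechanism only at a high level. As a rough illustration, the following NumPy sketch implements one plausible reading: random bipolar addresses induce a sparse ternary internal code (a hidden unit takes +1 when its address lies within Hamming radius ρn of the input, -1 when it lies within ρn of the input's complement, and 0 otherwise), patterns are stored as Hebbian outer products, and the same symmetric weights are used in both directions, folding the two layers into a feedback loop. The activation rule, parameter values, and all names here are assumptions made for illustration, not the paper's construction.

    import numpy as np

    rng = np.random.default_rng(0)

    n, m, rho = 24, 2000, 0.3   # input dimension, hidden-layer size, activation radius (illustrative values)

    # Random bipolar addresses, one per hidden unit (an assumed construction).
    A = rng.choice([-1, 1], size=(m, n))

    def encode(x):
        """Sparse ternary internal code (assumed rule): +1 for units whose
        address is within Hamming radius rho*n of x, -1 for units within
        rho*n of -x, and 0 otherwise."""
        dist = (n - A @ x) // 2              # Hamming distance via the bipolar inner product
        z = np.zeros(m)
        z[dist <= rho * n] = 1.0
        z[dist >= (1 - rho) * n] = -1.0
        return z

    # Hebbian (outer-product) storage between internal codes and stored patterns.
    patterns = rng.choice([-1, 1], size=(5, n))
    W = np.zeros((m, n))
    for x in patterns:
        W += np.outer(encode(x), x)

    def retrieve(x, iters=5):
        """Feedback retrieval: the symmetric weights serve both directions,
        so encoding and decoding alternate in one loop until (near) equilibrium."""
        for _ in range(iters):
            z = encode(x)
            x = np.sign(W.T @ z)
            x[x == 0] = 1                    # break ties deterministically
        return x

    # Probe with a corrupted copy of a stored pattern (2 of 24 bits flipped).
    probe = patterns[0].copy()
    probe[rng.choice(n, size=2, replace=False)] *= -1
    print(np.array_equal(retrieve(probe), patterns[0]))   # True when the memory corrects the errors

With these sparsity levels, the internal codes of distinct random patterns barely overlap, so the stored outer products interfere little and the probe's code projects back onto the original pattern; this is the intuition behind the capacity bounds quoted above, though the paper's precise construction and analysis may differ.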

Published in: IEEE Transactions on Information Theory (Volume 40, Issue 2)