Lower bounds on the capacities of binary and ternary networks storing sparse random vectors

Authors: Y. Baram; D. Sal'ee (Technion - Israel Institute of Technology, Haifa, Israel)

Abstract: It is shown that the memory capacity of networks of binary neurons storing, by the Hebbian rule, sparse random vectors over the field {0, 1}^N is at least cN/(p log N), where c is a positive scalar involving the input error probabilities and p is the probability of an element being nonzero. A similar bound is derived for networks of ternary neurons storing sparse vectors over {-1, 0, 1}^N. These results, pertaining to stability and error correction with probability tending to one as the number of neurons tends to infinity, generalize and extend previously known capacity bounds for binary networks storing vectors of equally probable {±1} bits. Lower bounds on the capacities of binary and ternary networks of finite sizes are also derived. These bounds suggest critical network sizes that guarantee high gains in capacity per neuron for given sparsities.
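For a concrete picture of the setting, the Python sketch below (not from the paper) stores sparse {0, 1}^N patterns with an outer-product Hebbian rule and checks whether a stored pattern is a fixed point of synchronous threshold dynamics. The mean-subtracted weights, the threshold theta, and the sizes N, p, M are illustrative assumptions; the paper's exact weight and threshold construction may differ.

import numpy as np

# Illustrative sketch: Hebbian storage of sparse {0, 1}^N patterns and a
# stability probe under threshold dynamics. The mean-subtracted weight
# rule and the threshold are assumptions, not the paper's construction.
rng = np.random.default_rng(0)

N = 200   # number of neurons
p = 0.05  # probability of an element being nonzero (sparsity)
M = 30    # number of stored patterns (illustrative; not the derived bound)

# Sparse random patterns over {0, 1}^N.
patterns = (rng.random((M, N)) < p).astype(float)

# Outer-product (Hebbian) weights on mean-subtracted activities.
X = patterns - p
W = X.T @ X / N
np.fill_diagonal(W, 0.0)  # no self-coupling

def recall(x, theta=0.02, steps=10):
    # Synchronous threshold updates; theta is hand-tuned for illustration.
    for _ in range(steps):
        x = (W @ (x - p) > theta).astype(float)
    return x

# A stored pattern should ideally be a fixed point of the dynamics.
x0 = patterns[0].copy()
print("fraction of bits preserved:", (recall(x0) == patterns[0]).mean())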

Published in: IEEE Transactions on Information Theory (Volume: 38, Issue: 6)