Techniques from coding theory are applied to study rigorously the capacity of the Hopfield associative memory. Such a memory stores n-tuples of ±1's. The components change depending on a hard-limited version of linear functions of all other components. With symmetric connections between components, a stable state is ultimately reached. By building up the connection matrix as a sum of outer products of m fundamental memories, one hopes to be able to recover a certain one of the m memories by using an initial n-tuple probe vector less than a Hamming distance n/2 away from the fundamental memory. If the m fundamental memories are chosen at random, the maximum asymptotic value of m in order that most of the m original memories are exactly recoverable is n/(2 log n). With the added restriction that every one of the m fundamental memories be recoverable exactly, m can be no more than n/(4 log n) asymptotically as n approaches infinity. Extensions are also considered, in particular to capacity under quantization of the outer-product connection matrix. This quantized-memory capacity problem is closely related to the capacity of the quantized Gaussian channel.
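The memory model described above can be illustrated with a minimal sketch: a sum-of-outer-products (Hebbian) connection matrix built from random ±1 memories, and recall by repeatedly hard-limiting the linear functions of the other components until a stable state is reached. The parameter choices (n = 200 components, m = 3 memories, a probe 10 bit flips away) are illustrative assumptions, chosen to sit well below the n/(2 log n) capacity so that recall succeeds.

```python
import numpy as np

rng = np.random.default_rng(0)

n, m = 200, 3  # n components, m fundamental memories (illustrative sizes)
memories = rng.choice([-1, 1], size=(m, n))

# Sum-of-outer-products connection matrix. Zeroing the diagonal removes
# self-connections; the matrix is symmetric, as the abstract requires.
T = memories.T @ memories
np.fill_diagonal(T, 0)

def recall(probe, max_iters=100):
    """Update each component with the hard-limited (sign) linear
    function of all other components until a stable state is reached."""
    state = probe.copy()
    for _ in range(max_iters):
        prev = state.copy()
        for i in range(n):
            h = T[i] @ state              # linear function of the other components
            state[i] = 1 if h >= 0 else -1  # hard limiter
        if np.array_equal(state, prev):     # fixed point: stable state reached
            break
    return state

# Probe: a fundamental memory with 10 of its 200 components flipped,
# i.e. well within Hamming distance n/2 of the stored memory.
probe = memories[0].copy()
flip = rng.choice(n, size=10, replace=False)
probe[flip] *= -1

recovered = recall(probe)
print("exact recovery:", np.array_equal(recovered, memories[0]))
```

With m this far below n/(2 log n), the crosstalk from the other memories is small relative to the signal from the probed memory, so the probe converges to the stored pattern; pushing m toward and past the capacity thresholds quoted above makes exact recovery fail with increasing probability.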