Classification performance of a Hopfield neural network based on a Hebbian-like learning rule

Authors: G. M. Jacyna (UNISYS Corp., Reston, VA, USA); E. R. Malaret

The theoretical classification performance of a Hopfield neural network is presented, establishing an important link between empirically based investigations of neural network classification models and the correct application of these models to AI-based systems. General expressions are derived relating the performance of the Hopfield model to the number and dimensionality of the code vectors stored in memory. The average performance of the network is analyzed by randomizing over the code vectors and measuring classification quality in terms of output bit errors. An exact probabilistic description of the network is derived for the first iteration, and an approximate second-moment analysis, generalizable to multiple iterations, examines performance near a fixed point. Degradations caused by noisy or incomplete input data are also analyzed. The results show that the Hopfield net has major limitations when applied to fixed pattern classification problems because of its sensitivity to the number of code vectors stored in memory and to the signal-to-noise ratio of the input data.
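The model analyzed in the paper can be illustrated with a minimal sketch: bipolar code vectors are stored with a Hebbian outer-product rule, and a noisy probe is iterated toward a fixed point, with performance measured in output bit errors. This is a generic Hopfield/Hebbian implementation for illustration only; the dimensionality `N`, the number of stored vectors `M`, and the noise level are arbitrary choices, not values from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative sizes (assumptions, not from the paper):
N, M = 64, 5                       # code-vector dimensionality and count
codes = rng.choice([-1, 1], size=(M, N))   # M random bipolar code vectors

# Hebbian-like learning rule: W = (1/N) * sum_m x_m x_m^T, zero diagonal
W = (codes.T @ codes) / N
np.fill_diagonal(W, 0)

def iterate(state, steps=1):
    """Synchronous sign updates; each step is one network iteration."""
    state = state.copy()
    for _ in range(steps):
        state = np.sign(W @ state)
        state[state == 0] = 1      # break ties deterministically
    return state

# Noisy input: a stored code vector with a few bits flipped
probe = codes[0].copy()
flip = rng.choice(N, size=8, replace=False)
probe[flip] *= -1

recalled = iterate(probe, steps=5)
bit_errors = int(np.sum(recalled != codes[0]))
print("output bit errors:", bit_errors)
```

Increasing `M` relative to `N`, or flipping more probe bits, degrades recall — the sensitivity to memory loading and input signal-to-noise ratio that the analysis quantifies.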

Published in: IEEE Transactions on Information Theory (Volume: 35, Issue: 2)