The self-trapping attractor neural network. I. Analysis of a simple 1-D model

Authors: R. Pavloski and M. Karimi
Depts. of Psychology and Physics, Indiana University of Pennsylvania, Indiana, PA, USA

Abstract: Attractor neural networks (ANNs) based on the Ising model are naturally fully connected and homogeneous in structure. These features permit a deep understanding of the underlying mechanism but limit the applicability of these models to the brain. A more biologically realistic model can be derived from an equally simple physical model by using recurrent self-trapping inputs to supplement very sparse intranetwork interactions. This paper reports the analysis of a one-dimensional (1-D) ANN coupled to a second system that computes overlaps with a single stored memory. Results show that: 1) the 1-D self-trapping model is equivalent to an isolated ANN with both full connectivity of one strength and nearest-neighbor synapses of an independent strength; 2) the dynamics of the ANN and self-trapping updates are independent; 3) there is a critical synaptic noise level below which memory retrieval occurs; 4) the 1-D self-trapping model converges to a fully connected Hopfield model for zero-strength nearest-neighbor synapses and exhibits a memory overlap of greater magnitude for nonzero-strength nearest-neighbor synapses; and 5) the mechanism of self-trapping is an iterative map on the mean overlap as a function of the reentrant input.
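The mechanism summarized above lends itself to a short simulation. The following is a minimal sketch, not the authors' implementation: it assumes Glauber (heat-bath) single-spin updates on a 1-D ring of Ising spins, a single stored pattern, and a reentrant input proportional to the current memory overlap. The parameter names (J_re, J_nn, T), the noisy-cue initialization, and the once-per-sweep overlap refresh are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

N = 200      # network size (illustrative)
J_re = 1.0   # strength of the reentrant self-trapping input
J_nn = 0.5   # independent nearest-neighbor synaptic strength
T = 0.2      # synaptic noise level; retrieval is expected below a critical value

xi = rng.choice([-1, 1], size=N)              # single stored memory pattern
s = np.where(rng.random(N) < 0.75, xi, -xi)   # noisy cue, initial overlap ~ 0.5

def overlap(s, xi):
    """Overlap with the stored memory, as computed by the second system."""
    return (xi @ s) / N

for sweep in range(50):
    m = overlap(s, xi)  # reentrant input refreshed independently, once per sweep
    for i in rng.permutation(N):
        # Local field: nearest neighbors on the ring plus the reentrant term.
        # For a single pattern, J_re * xi[i] * m acts like full Hebbian connectivity.
        h = J_nn * (s[(i - 1) % N] + s[(i + 1) % N]) + J_re * xi[i] * m
        p_up = 1.0 / (1.0 + np.exp(-2.0 * h / T))  # Glauber flip probability
        s[i] = 1 if rng.random() < p_up else -1

print(f"final overlap m = {overlap(s, xi):.3f}")
```

In this sketch, iterating m through the sweep loop plays the role of the iterative map in result 5); with J_nn = 0, the mean-field update reduces to the single-pattern Hopfield map m ↦ tanh(J_re m / T), consistent with the convergence noted in result 4).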

Published in: IEEE Transactions on Neural Networks (Volume 14, Issue 1)