Dynamics analysis and analog associative memory of networks with LT neurons

Authors: Huajin Tang, K. C. Tan, E. J. Teoh
Dept. of Electrical & Computer Engineering, National University of Singapore, Singapore

The additive recurrent network structure of linear threshold neurons represents a class of biologically-motivated models, where nonsaturating transfer functions are necessary for representing neuronal activities, such as those of cortical neurons. This paper extends the existing results of dynamics analysis of such linear threshold networks by establishing new and milder conditions for boundedness and asymptotic stability, while allowing for multistability. As a condition for asymptotic stability, it is found that boundedness does not require a deterministic matrix to be symmetric or to possess positive off-diagonal entries. These conditions provide an explicit way to design and analyze such networks. Based on the established theory, an alternative approach to studying such networks is through permitted and forbidden sets. An application of the linear threshold (LT) network is analog associative memory, for which a simple design method is suggested in this paper. The proposed design method is similar to a generalized Hebbian approach, but with the distinction of additional network parameters for normalization, excitation, and inhibition, on both global and local scales. The computational abilities of the network depend on its nonlinear dynamics, which in turn depend on the sparsity of the memory vectors.
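
The abstract describes the LT model and the Hebbian-style memory design only at a high level. As a rough, self-contained sketch of the kind of system involved (not the authors' actual design), the following Python code builds an outer-product weight matrix with a crude global-inhibition term and integrates the standard additive LT dynamics dx/dt = -x + W f(x) + h with the nonsaturating transfer function f(x) = max(0, x). All parameters, the inhibition scheme, and the data are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def hebbian_weights(patterns, alpha=0.05, beta=1.0):
    """Outer-product (Hebbian-style) weight matrix for P memory patterns of
    dimension n, with a uniform global-inhibition term. alpha (local
    excitation scale) and beta (global inhibition) are illustrative stand-ins
    for the normalization/excitation/inhibition parameters mentioned in the
    abstract, not the paper's actual design."""
    n = patterns.shape[1]
    W = alpha * patterns.T @ patterns / n      # local Hebbian excitation
    W -= beta * np.ones((n, n)) / n            # uniform global inhibition
    return W

def simulate_lt(W, h, steps=5000, dt=0.01):
    """Euler integration of the additive LT dynamics
    dx/dt = -x + W f(x) + h, with f(x) = max(0, x) (nonsaturating)."""
    x = np.zeros(W.shape[0])
    for _ in range(steps):
        fx = np.maximum(x, 0.0)                # linear threshold activation
        x = x + dt * (-x + W @ fx + h)
    return x

# Illustrative run with two sparse memory vectors (arbitrary data):
rng = np.random.default_rng(0)
patterns = (rng.random((2, 8)) < 0.25).astype(float)   # sparse 0/1 patterns
W = hebbian_weights(patterns)
probe = patterns[0] + 0.1 * rng.standard_normal(8)     # noisy cue as input
print(simulate_lt(W, probe))
```

In this sketch the noisy cue is applied as the external input h and the network relaxes toward a steady state; the sparsity of the stored patterns controls how much their excitatory contributions overlap, which is the dependence on memory-vector sparsity that the abstract points to.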

Published in: IEEE Transactions on Neural Networks (Volume: 17, Issue: 2)

Date of Publication: March 2006
