
Global output convergence of a class of continuous-time recurrent neural networks with time-varying thresholds

Authors: Derong Liu (Dept. of Electr. & Comput. Eng., Univ. of Illinois, Chicago, IL, USA); Sanqing Hu; Jun Wang

This paper discusses the global output convergence of a class of continuous-time recurrent neural networks (RNNs) with globally Lipschitz continuous, monotone nondecreasing activation functions and locally Lipschitz continuous time-varying thresholds. We establish a sufficient condition that guarantees the global output convergence of this class of neural networks. The result does not require the connection weight matrix to be symmetric, and it is useful in the design of recurrent neural networks with time-varying thresholds.
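For context, networks of this class are commonly written in the following form; this is a sketch of the standard setup, and the notation D, W, g, and θ below is assumed for illustration rather than taken from the paper itself:

\[
\dot{x}(t) = -D\,x(t) + W\,g(x(t)) + \theta(t), \qquad y(t) = g(x(t)),
\]

where x(t) is the state vector, D is a diagonal matrix of positive constants, W is the (not necessarily symmetric) connection weight matrix, g(·) applies a globally Lipschitz continuous, monotone nondecreasing activation function componentwise, θ(t) is the locally Lipschitz continuous time-varying threshold vector, and y(t) is the network output. Global output convergence then means that y(t) tends to a constant vector as t → ∞ for every initial state x(0).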

Published in:

IEEE Transactions on Circuits and Systems II: Express Briefs (Volume 51, Issue 4)