Improved Delay-Dependent Stability Condition of Discrete Recurrent Neural Networks With Time-Varying Delays

Authors (4): Zhengguang Wu (National Laboratory of Industrial Control Technology, Zhejiang University, Hangzhou, China); Hongye Su; Jian Chu; Wuneng Zhou

This brief investigates the problem of global exponential stability analysis for discrete recurrent neural networks with time-varying delays. Using a linear matrix inequality (LMI) approach, a novel delay-dependent stability criterion is established for the considered recurrent neural networks via a new Lyapunov function. The obtained condition is less conservative and involves fewer decision variables than existing ones. A numerical example is given to demonstrate the effectiveness of the proposed method.
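The paper's delay-dependent LMI criterion itself is not reproduced on this page, but the underlying idea, certifying stability of a discrete-time system by finding a positive definite matrix P for a Lyapunov function V(x) = xᵀPx, can be sketched in the simpler delay-free case. The system matrix below is a hypothetical Schur-stable example, not the paper's numerical example:

```python
import numpy as np
from scipy.linalg import solve_discrete_lyapunov

# Hypothetical Schur-stable system matrix (spectral radius < 1);
# not taken from the paper's numerical example.
A = np.array([[0.5, 0.1],
              [0.0, 0.4]])

# Solve the discrete Lyapunov equation A^T P A - P = -Q for P.
Q = np.eye(2)
P = solve_discrete_lyapunov(A.T, Q)

# P > 0 certifies global exponential stability of x_{k+1} = A x_k,
# since V(x) = x^T P x then strictly decreases along trajectories.
assert np.all(np.linalg.eigvalsh(P) > 0)
print("Lyapunov certificate found: P is positive definite")
```

The delayed case treated in the paper replaces this single equation with an LMI feasibility problem over several coupled matrix variables, which is where reducing the number of variables matters for solver cost.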

Published in: IEEE Transactions on Neural Networks (Volume 21, Issue 4)