
Combined Convex Technique on Delay-Dependent Stability for Delayed Neural Networks


Authors: Tao Li (School of Automation Engineering, Nanjing University of Aeronautics and Astronautics, Nanjing); Ting Wang; Aiguo Song; Shumin Fei

In this brief, by employing an improved Lyapunov-Krasovskii functional (LKF) and combining the reciprocally convex technique with the convex one, a new sufficient condition is derived that guarantees a class of delayed neural networks (DNNs) is globally asymptotically stable. Since some previously ignored terms can be taken into account when estimating the derivative of the LKF, a less conservative stability criterion is obtained in the form of linear matrix inequalities (LMIs), whose solvability depends heavily on the information of the addressed DNNs. Finally, two numerical examples demonstrate that our results reduce conservatism more effectively than some currently used methods.
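To illustrate the flavor of LMI-based stability analysis for DNNs, the sketch below checks feasibility of a classic delay-independent Lyapunov-Krasovskii condition numerically. This is not the paper's combined-convex criterion; all matrices (`C`, `B`, `P`, `Q`), the Lipschitz bound `L`, and the S-procedure multiplier `lam` are assumed example values chosen by hand.

```python
import numpy as np

# Illustrative sketch only (assumed values; NOT the paper's criterion).
# For the delayed network  x'(t) = -C x(t) + B f(x(t - tau)),  |f(u)| <= L|u|,
# take the LKF  V = x'Px + integral of x'Qx over [t - tau, t].
# An S-procedure with multiplier lam bounds dV/dt by z' M z,
# z = [x; x(t - tau); f(x(t - tau))]; if M is negative definite for some
# P > 0, Q > 0, lam > 0, the origin is globally asymptotically stable
# for any constant delay tau.

n = 2
C = 2.0 * np.eye(n)                      # positive-diagonal self-feedback
B = np.array([[0.3, -0.2], [0.1, 0.2]])  # delayed connection weights
L = 1.0                                  # Lipschitz bound on activation f

# Candidate LKF matrices, chosen by hand here; a real criterion would
# search for them with an LMI solver (e.g., an SDP package).
P = np.eye(n)
Q = np.eye(n)
lam = 0.5

Z = np.zeros((n, n))
M = np.block([
    [-P @ C - C @ P + Q, Z,                           P @ B],
    [Z,                  -Q + lam * L**2 * np.eye(n), Z],
    [B.T @ P,            Z,                           -lam * np.eye(n)],
])

max_eig = np.linalg.eigvalsh(M).max()
print("LMI feasible (M negative definite):", max_eig < 0)
```

For this data the check succeeds, since the Schur complement `-PC - CP + Q + (1/lam) * P B B' P` is dominated by the `-3I` diagonal; the conditions in the brief refine this idea by keeping delay-dependent cross terms that such a coarse bound discards.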

Published in:

IEEE Transactions on Neural Networks and Learning Systems (Volume: 24, Issue: 9)