Delay-Derivative-Dependent Stability for Delayed Neural Networks With Unbound Distributed Delay

Authors: Tao Li (Sch. of Instrum. Sci. & Eng., Southeast Univ., Nanjing, China); Aiguo Song; Shumin Fei; Ting Wang

In this brief, a new sufficient condition guaranteeing global stability of delayed neural networks with unbounded distributed delay is derived using the Lyapunov-Krasovskii functional approach together with an appropriate integral inequality, employing an improved delay-partitioning technique and a general convex combination. The resulting LMI-based criterion depends on both the upper and lower bounds of the time delay and of its derivative, which distinguishes it from existing criteria and gives it a wider range of application than some present results. Finally, three numerical examples illustrate the efficiency of the new method, whose reduced conservatism is achieved by refining the partition of the delay interval.
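In practice, criteria of this kind are checked by posing the stability condition as an LMI feasibility problem and handing it to a semidefinite solver. The sketch below is not the delay-derivative-dependent criterion of this brief; it is a minimal illustration of that workflow under much simpler assumptions, checking a classical delay-independent Lyapunov-Krasovskii LMI for a linear system with a constant delay, dx/dt = A x(t) + Ad x(t - tau). The matrices A and Ad are made-up example values, and the code assumes the CVXPY package with its bundled SCS solver is available.

import numpy as np
import cvxpy as cp

# Hypothetical example system dx/dt = A x(t) + Ad x(t - tau); values are illustrative only.
A = np.array([[-3.0, 0.0],
              [ 0.0, -3.0]])
Ad = np.array([[0.5, 0.2],
               [0.1, 0.4]])
n = A.shape[0]

# Decision variables of the Lyapunov-Krasovskii functional
# V(x_t) = x(t)' P x(t) + integral over [t - tau, t] of x(s)' Q x(s) ds.
P = cp.Variable((n, n), symmetric=True)
Q = cp.Variable((n, n), symmetric=True)
eps = 1e-6  # small margin to enforce strict inequalities numerically

# Classical delay-independent stability LMI:
# [[A'P + P A + Q,  P Ad],
#  [Ad' P,          -Q  ]] < 0,  with P > 0 and Q > 0.
lmi = cp.bmat([[A.T @ P + P @ A + Q, P @ Ad],
               [Ad.T @ P,            -Q]])

constraints = [P >> eps * np.eye(n),
               Q >> eps * np.eye(n),
               lmi << -eps * np.eye(2 * n)]

prob = cp.Problem(cp.Minimize(0), constraints)
prob.solve(solver=cp.SCS)
print("LMI feasible (stability certified for any constant delay):",
      prob.status == cp.OPTIMAL)

The brief's actual criterion replaces this simple functional with an augmented one built over a partitioned delay interval, so its LMI depends explicitly on the bounds of the delay and of its derivative; the feasibility-checking workflow, however, is the same.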

Published in: IEEE Transactions on Neural Networks, Volume 21, Issue 8