This brief addresses the stability analysis of a class of recurrent neural networks (RNNs) with an interval time-varying delay. Both delay-independent and delay-dependent conditions are derived. For the former, an augmented Lyapunov functional is constructed in which the derivative of the state is retained. Because the resulting criterion decouples the Lyapunov matrices from the coefficient matrices of the network, it extends readily to neural networks with polytopic uncertainties. For the latter, a new type of delay-range-dependent condition is proposed using the free-weighting-matrix technique to obtain a tighter upper bound on the derivative of the Lyapunov-Krasovskii functional. Two examples are given to illustrate the effectiveness and reduced conservatism of the proposed results.
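To give a flavor of the Lyapunov/LMI machinery underlying such stability criteria, the sketch below checks a classical Lur'e-type condition for a delay-free network x' = -Ax + W f(x) whose activations satisfy a sector bound 0 <= f_i(s)/s <= l_i. This is a simplified illustration only, not the brief's delay-range-dependent criterion; the matrices A, W, and the candidate Lyapunov and multiplier matrices P and Lambda are made-up numbers chosen for demonstration.

```python
import numpy as np

def lure_lmi(A, W, L, P, Lam):
    """Assemble the LMI block matrix for x' = -A x + W f(x) with sector-
    bounded activations.  With V(x) = x^T P x and an S-procedure multiplier
    Lambda = diag(lam_i) >= 0, V' < 0 along trajectories whenever
        [[-P A - A^T P,      P W + L Lam],
         [W^T P + Lam L,        -2 Lam  ]]  <  0  (negative definite)."""
    return np.block([[-P @ A - A.T @ P, P @ W + L @ Lam],
                     [W.T @ P + Lam @ L, -2.0 * Lam]])

def is_neg_def(M, tol=1e-9):
    """Negative definiteness via the eigenvalues of the symmetrized matrix."""
    return float(np.max(np.linalg.eigvalsh((M + M.T) / 2))) < -tol

# Illustrative (made-up) data: leak rates, weights, unit activation gains.
A = np.diag([2.0, 3.0])                  # self-feedback (leak) rates
W = np.array([[0.5, -0.3], [0.2, 0.4]])  # interconnection weights
L = np.eye(2)                            # sector gains l_i = 1 (e.g. tanh)
P = np.eye(2)                            # candidate Lyapunov matrix
Lam = np.eye(2)                          # candidate multiplier

M = lure_lmi(A, W, L, P, Lam)
print(is_neg_def(M))  # True: this candidate pair certifies stability
```

In practice P and Lambda are decision variables found by a semidefinite-programming solver rather than guessed; the delayed, delay-range-dependent conditions of the brief follow the same pattern with an augmented functional and additional free-weighting matrices.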