Global stability analysis of discrete-time recurrent neural networks

Authors: N. E. Barabanov (St. Petersburg State Electrotechnical Univ., Russia); D. V. Prokhorov

We address the problem of Lyapunov stability of discrete-time recurrent neural networks (RNNs) with fixed weights. Based on classical results from the theory of absolute stability, we propose a new approach to the stability analysis of RNNs with sector-type monotone nonlinearities. We devise a simple state-space transformation that converts the original RNN equations into a form suitable for our stability analysis. We then formulate linear matrix inequalities (LMIs) whose solution determines whether the RNN is globally exponentially stable. Unlike previous treatments, our approach naturally accounts for the nonzero biases usually present in RNNs for improved approximation capability. We illustrate the use of our approach with an example.
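The paper's LMI test requires a semidefinite-programming solver, but a much cruder sufficient condition illustrates the flavor of the result: if the activation is 1-Lipschitz (e.g. tanh) and the weight matrix has spectral norm below 1, the state update x[k+1] = W·σ(x[k]) + b is a contraction, so a unique, globally exponentially stable equilibrium exists regardless of the bias. A minimal sketch (the 3-neuron network and its weights are hypothetical, not taken from the paper):

```python
import numpy as np

def is_contraction(W, tol=1e-9):
    """Crude sufficient test: spectral norm (largest singular
    value) of W below 1 makes x -> W @ tanh(x) + b a contraction,
    since tanh is 1-Lipschitz."""
    return np.linalg.norm(W, 2) < 1 - tol

# Hypothetical recurrent network: x[k+1] = W @ tanh(x[k]) + b
W = np.array([[0.2, -0.1, 0.00],
              [0.1,  0.3, 0.10],
              [0.0, -0.2, 0.25]])
b = np.array([0.5, -0.3, 0.1])

contracting = is_contraction(W)

# Two trajectories from very different initial states should
# converge to the same fixed point when the map is contracting.
x1 = np.full(3, 10.0)
x2 = np.full(3, -10.0)
for _ in range(200):
    x1 = W @ np.tanh(x1) + b
    x2 = W @ np.tanh(x2) + b
```

Note that this spectral-norm check can be very conservative; the LMI formulation in the paper exploits the sector and monotonicity structure of the nonlinearity and therefore certifies stability for a much larger class of weight matrices.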

Published in:

Proceedings of the 2001 American Control Conference (Volume 6)

Date of Conference:

2001