We address the problem of global Lyapunov stability of discrete-time recurrent multilayer neural networks (RMLNN) in the unforced (unperturbed) setting. The network weights are assumed fixed, for example at the values attained after training. To apply the method of reduction of attractor estimates, we use a state-space extension to represent the RMLNN as a discrete-time dynamical system. We also describe a new algorithm for checking the global asymptotic stability of an RMLNN; it is likewise based on the method of reduction of attractor estimates and is considerably more efficient computationally. An example demonstrates the efficiency of the new algorithm.
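The state-space extension idea mentioned above can be illustrated with a minimal sketch. The weights, the two-layer architecture, and the norm-based stability test below are all illustrative assumptions, not the paper's construction or its attractor-estimate algorithm: the sketch only shows how stacking the layer states into one vector turns a multilayer recurrent network into an ordinary discrete-time map x_{k+1} = f(x_k), on which a crude sufficient contraction condition for global asymptotic stability of the origin can be checked.

```python
import numpy as np

# Hypothetical two-layer recurrent network; these weights are
# illustrative only, not taken from the paper.
W1 = np.array([[0.3, -0.2], [0.1, 0.25]])   # layer-1 recurrent weights
W2 = np.array([[0.2, 0.1], [-0.15, 0.3]])   # layer-2 weights

def step(x):
    """One unforced update of the extended state x = [h1; h2].

    State-space extension: the states of both layers are stacked into a
    single vector, so the multilayer network becomes an ordinary
    discrete-time dynamical system x_{k+1} = f(x_k).
    """
    h1, h2 = x[:2], x[2:]
    h1_next = np.tanh(W1 @ h1 + W2.T @ h2)  # layer 1 receives layer-2 feedback
    h2_next = np.tanh(W2 @ h1_next)         # layer 2 driven by layer 1
    return np.concatenate([h1_next, h2_next])

# Crude sufficient condition: tanh is 1-Lipschitz, so if the stacked
# weight block has spectral norm < 1, the layer-1 update is a
# contraction and the origin is globally asymptotically stable. This is
# far weaker than the attractor-estimate reduction, but shows the kind
# of check one automates.
L = np.linalg.norm(np.hstack([W1, W2.T]), 2)
print("contractive:", L < 1)

# Empirical check: a trajectory from an arbitrary initial state
# converges to the origin.
x = np.array([0.9, -0.5, 0.4, 0.7])
for _ in range(200):
    x = step(x)
print("converged:", np.linalg.norm(x) < 1e-6)
```

Under these particular weights both checks succeed; the paper's algorithm is of interest precisely because such simple norm conditions are conservative and can fail for networks that are nonetheless globally asymptotically stable.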