Comments on "New conditions for global stability of neural networks with application to linear and quadratic programming problems"


Authors: Xue-Bin Liang and Li-De Wu (Dept. of Computer Science, Fudan University, Shanghai, China)

For the original paper, see M. Forti and A. Tesi, ibid., vol. 42, pp. 354-66 (1995). This letter makes the following comments: 1) the assumption that all neuron activation functions vanish at the origin, which is used in the proof of the result implying the existence and uniqueness of the network equilibrium point, can actually be omitted; 2) in the infinite sector case, the result of global asymptotic stability (GAS) remains true for the class of increasing (not necessarily strictly increasing) activations, just as in the finite sector case. Consequently, a result on the absolute stability (ABST) of neural networks is also obtained, which generalizes the existing related results.
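The two relaxations can be illustrated numerically. Below is a minimal sketch (not from the letter) that simulates a Hopfield-type network of the form studied by Forti and Tesi, dx/dt = -Dx + Tg(x) + I, using an activation that is increasing but not strictly increasing (a saturating ramp) and that does not vanish at the origin. The interconnection matrix `T` and input `I` are illustrative assumptions, chosen small enough that the system is contractive, so trajectories from different initial states should reach the same equilibrium.

```python
import numpy as np

def g(s):
    # Increasing (but not strictly increasing) activation with g(0) = 0.3 != 0:
    # constant outside its linear region, so it violates both classical
    # assumptions the comment letter shows can be dropped.
    return np.clip(s + 0.3, -1.0, 1.0)

def simulate(x0, T, I, dt=0.01, steps=20000):
    # Forward-Euler integration of dx/dt = -x + T g(x) + I  (D taken as identity).
    x = np.array(x0, dtype=float)
    for _ in range(steps):
        x = x + dt * (-x + T @ g(x) + I)
    return x

# Illustrative parameters (assumed, not from the paper); spectral norm of T < 1.
T = np.array([[0.2, -0.3],
              [0.1,  0.25]])
I = np.array([0.5, -0.4])

xa = simulate([5.0, -7.0], T, I)
xb = simulate([-3.0, 9.0], T, I)
# Widely separated initial states converge to the same equilibrium point.
print(np.allclose(xa, xb, atol=1e-6))
```

Since `g` is 1-Lipschitz and the norm of `T` is below 1, the right-hand side is contracting toward a unique equilibrium, which is what global asymptotic stability predicts for this parameter choice.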

Published in:

IEEE Transactions on Circuits and Systems I: Fundamental Theory and Applications (Volume 44, Issue 11)