This note presents an analysis of the convergence rate of a projection neural network, with application to constrained optimization and related problems. It is shown that the state trajectory of the projection neural network converges exponentially to its equilibrium point if the Jacobian matrix of the nonlinear mapping is positive definite, while the convergence rate is proportional to a design parameter if the Jacobian matrix is only positive semidefinite. Moreover, the convergence time is guaranteed to be finite if the design parameter is chosen sufficiently large. Furthermore, if a diagonal block of the Jacobian matrix is positive definite, then the corresponding partial state trajectory of the projection neural network also converges exponentially. Three optimization examples illustrate the convergence performance of the projection neural network.
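As a concrete illustration of the kind of dynamics discussed above, projection neural networks for constrained optimization are commonly written as dx/dt = lambda * (P(x - F(x)) - x), where P is the projection onto the feasible set, F is the (gradient) mapping whose Jacobian the convergence conditions refer to, and lambda is the design parameter that scales the convergence rate. The sketch below assumes that standard form (the specific network, mapping, and examples in the note may differ) and integrates the dynamics with forward Euler for a small box-constrained quadratic program whose solution is known; `Q`, `c`, and the box bounds are illustrative choices, not taken from the note.

```python
import numpy as np

def projection_network_trajectory(Q, c, lower, upper, x0,
                                  lam=1.0, h=0.1, steps=500):
    """Simulate dx/dt = lam * (P(x - F(x)) - x) by forward Euler.

    F(x) = Q x + c is the gradient of the quadratic objective
    0.5 x^T Q x + c^T x; P clips x onto the box [lower, upper].
    An equilibrium x* satisfies x* = P(x* - F(x*)), i.e. the KKT
    conditions of the box-constrained QP.
    """
    x = np.asarray(x0, dtype=float).copy()
    for _ in range(steps):
        grad = Q @ x + c                      # F(x)
        proj = np.clip(x - grad, lower, upper)  # P(x - F(x))
        x = x + h * lam * (proj - x)          # Euler step of the dynamics
    return x

# Example QP: minimize x1^2 - 2 x1 + x2^2 - 4 x2 over the box [0, 1]^2.
# The unconstrained minimizer is (1, 2); clipping to the box gives (1, 1).
Q = np.array([[2.0, 0.0], [0.0, 2.0]])
c = np.array([-2.0, -4.0])
x_star = projection_network_trajectory(Q, c, lower=0.0, upper=1.0,
                                       x0=np.zeros(2))
print(x_star)  # converges toward (1, 1)
```

A larger `lam` (with the step size `h` reduced to keep `h * lam` stable) speeds up convergence, which is the role the design parameter plays in the rate results summarized above.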