Convergence acceleration of the Hopfield neural network by optimizing integration step sizes

Author: S. Abe (Research Laboratory, Hitachi Ltd., Japan)

In our previous work, we clarified the global convergence of the Hopfield neural network and showed, by computer simulation, that solution quality improves when the diagonal elements of the coefficient matrix are gradually decreased. In this paper, to accelerate convergence of the Hopfield network, the integration step size is determined dynamically at each time step so that at least one component of the variable vector reaches the surface of the hypercube. Computer simulations of the traveling salesman problem and an LSI module placement problem show that, compared to integration with a constant step size, convergence is both stabilized and accelerated.
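The step-size rule described in the abstract can be sketched as follows. Assuming the network state is confined to the unit hypercube [0, 1]^n and that an explicit Euler step is used (the paper's exact formulation may differ), the step size is the largest value for which the updated state stays inside the hypercube, which means the first component to hit a face determines the step:

```python
import numpy as np

def dynamic_step(x, dxdt, eps=1e-12):
    """Largest step size alpha such that x + alpha * dxdt stays in [0, 1]^n,
    with at least one component landing exactly on the hypercube surface.

    A sketch of the idea in the abstract; names and the Euler-step form
    are assumptions, not taken from the paper.
    """
    # Distance each component may travel before hitting a face of [0, 1]^n.
    dist = np.where(dxdt > 0, 1.0 - x, x)
    moving = np.abs(dxdt) > eps
    if not np.any(moving):
        return 0.0  # equilibrium: no component is moving
    # The first component to reach a face limits the step size.
    return float(np.min(dist[moving] / np.abs(dxdt[moving])))

# One adaptive Euler step:
x = np.array([0.2, 0.5, 0.9])
dx = np.array([0.4, -0.1, 0.05])
alpha = dynamic_step(x, dx)
x_new = x + alpha * dx  # at least one component is now 0 or 1
```

Compared with a fixed step size, this choice cannot overshoot the hypercube, and it takes the largest admissible step each iteration, which is what drives the acceleration reported in the simulations.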

Published in:

IEEE Transactions on Systems, Man, and Cybernetics, Part B: Cybernetics (Volume 26, Issue 1)