
Paralleled hardware annealing for optimal solutions on electronic neural networks

2 Author(s)
B. W. Lee; B. J. Sheu (Dept. of Electr. Eng., Univ. of Southern California, Los Angeles, CA, USA)

Three basic neural-network schemes have been extensively studied by researchers: iterative networks, backpropagation networks, and self-organizing networks. Simulated annealing is a probabilistic hill-climbing technique that accepts, with a nonzero but gradually decreasing probability, deterioration in the cost function of an optimization problem. Hardware annealing, which combines the simulated annealing technique with continuous-time electronic neural networks by changing the voltage gain of the neurons, is discussed. The initial and final voltage gains for applying hardware annealing to Hopfield data-conversion networks are presented. In hardware annealing, the voltage gain of the output neurons is increased continuously from a low initial value to a high final value, which helps the network reach the optimal solution of an optimization problem in a single annealing cycle. Experimental results on the transfer function and transient response of electronic neural networks reaching the global minimum are also presented.
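
To make the gain-ramping idea concrete, the sketch below simulates a small continuous-time Hopfield network whose neuron voltage gain is ramped from a low initial value to a high final value over one annealing cycle. This is only a minimal illustration under stated assumptions: the function name hardware_anneal, the integration parameters, and the 4-bit data-conversion weights (in the familiar Tank-Hopfield A/D-converter style) are hypothetical and are not taken from the paper.

```python
import numpy as np

def hardware_anneal(W, I, t_total=5.0, dt=1e-3, gain_start=0.1, gain_end=50.0):
    """Integrate a continuous-time Hopfield network while ramping the
    neuron voltage gain from gain_start to gain_end (hardware annealing)."""
    u = np.zeros(len(I))                        # internal neuron voltages
    steps = int(t_total / dt)
    for k in range(steps):
        gain = gain_start + (gain_end - gain_start) * k / steps
        v = 0.5 * (1.0 + np.tanh(gain * u))     # sigmoid outputs in [0, 1]
        u += dt * (-u + W @ v + I)              # leaky-integrator dynamics
    return 0.5 * (1.0 + np.tanh(gain_end * u))

# Illustrative 4-bit data-conversion (A/D) network; weights and biases below
# are assumptions for demonstration only, not values from the paper.
x = 9.3                                          # analog input to convert
W = np.array([[0.0 if i == j else -2.0 ** (i + j) for j in range(4)]
              for i in range(4)])
I = np.array([(2.0 ** i) * x - 2.0 ** (2 * i - 1) for i in range(4)])
print(np.round(hardware_anneal(W, I)))           # ideally the binary code of x, LSB first
```

Because the gain starts low, the network's energy surface is initially smooth and the state is not trapped in a local minimum; as the gain rises, the surface sharpens and the outputs settle toward a discrete code, which is the behavior the paper exploits.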

Published in:

IEEE Transactions on Neural Networks (Volume: 4, Issue: 4)