Computational-complexity reduction for neural network algorithms

3 Author(s)
A. Guez, M. Kam, J.L. Eilbert (Drexel Univ., Philadelphia, PA)

An important class of neural models is described by a set of coupled nonlinear differential equations whose state variables correspond to the axon hillock potentials of neurons. Through a nonlinear transformation, these models can be converted to an equivalent system of differential equations whose state variables correspond to firing rates. The firing-rate formulation has certain computational advantages over the potential formulation: the computational and storage burdens per simulation cycle are reduced, and the resulting equations become quasilinear in a large and significant subset of the state space. Moreover, the dynamic range of the state space is bounded, which alleviates numerical stability problems in network simulation. These advantages are demonstrated on an example, using the network proposed by J.J. Hopfield and D.W. Tank (1985) as a so-called neural solution to the traveling salesman problem.
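The potential-to-firing-rate change of variables described above can be illustrated with a minimal sketch (the network, weights, and logistic nonlinearity below are illustrative assumptions, not the authors' exact equations). With potential dynamics du/dt = -u + W g(u) + I and firing rate v = g(u), the chain rule gives dv/dt = g'(g^{-1}(v)) (-g^{-1}(v) + W v + I); for the logistic g, the gain g'(u) equals v(1 - v), so the firing-rate state remains bounded in (0, 1):

```python
import numpy as np

def g(u):                      # logistic firing-rate nonlinearity (assumed form)
    return 1.0 / (1.0 + np.exp(-u))

def g_inv(v):                  # inverse of g, defined on (0, 1)
    return np.log(v / (1.0 - v))

W = np.array([[0.0, -1.0],     # toy two-neuron network with mutual inhibition
              [-1.0, 0.0]])
I = np.array([0.5, 0.2])       # constant external input
dt, steps = 0.01, 5000

# Potential formulation: du/dt = -u + W g(u) + I
u = np.zeros(2)
for _ in range(steps):
    u += dt * (-u + W @ g(u) + I)

# Firing-rate formulation: dv/dt = g'(g^{-1}(v)) * (-g^{-1}(v) + W v + I),
# where g'(u) = g(u)(1 - g(u)) = v(1 - v), so the state stays in (0, 1).
v = np.full(2, 0.5)
for _ in range(steps):
    v += dt * (v * (1.0 - v)) * (-g_inv(v) + W @ v + I)

print(g(u), v)                 # both formulations approach the same fixed point
```

The bounded range of v is what eases numerical stability, and the quasilinear term W v (versus W g(u)) is one source of the per-cycle savings the abstract mentions.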

Published in: IEEE Transactions on Systems, Man and Cybernetics (Volume: 19, Issue: 2)