In this paper, the Hopfield neural network with delay (HNND) is studied as a model of optimized computation. Using an energy function method, we establish a fundamental result in the theory of computation and show that the discrete Hopfield neural network with delay is capable of carrying out computation for a class of combinatorial optimization problems. The evolution of the HNND is related to the ascent of an energy function toward its maximum value. The new energy function proposed here depends on the previous (delay) state of the network; by comparing energy values across states, the network is able to escape from local maxima in order to reach a global maximum of the energy function. Furthermore, we use the energy function method to prove that the discrete asymmetric network with delay has a cycle of length 2. It is also shown that the diagonal elements of the connection matrix have an important influence on the convergence process, and that they determine the relationship between the local maxima of the energy function and the updating mode of the network.
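To make the setting concrete, the following is a minimal sketch of a discrete Hopfield network with a one-step delay and a delay-dependent energy function. The update rule `x(t+1) = sgn(W x(t) + W_d x(t-1) - theta)` and the specific energy expression are assumptions for illustration; the matrices `W`, `W_d`, the threshold `theta`, and the function names are hypothetical and need not match the paper's exact formulation.

```python
import numpy as np

def hnnd_step(W, W_d, theta, x_t, x_prev):
    """One parallel update of a discrete Hopfield network with delay.

    Assumed rule (illustrative): x(t+1) = sgn(W x(t) + W_d x(t-1) - theta),
    where x_prev is the delayed state x(t-1)."""
    h = W @ x_t + W_d @ x_prev - theta
    return np.where(h >= 0, 1, -1)

def energy(W, W_d, theta, x_t, x_prev):
    """A candidate energy that depends on both the current and the
    previous (delay) state; a sketch, not the paper's exact function."""
    return 0.5 * x_t @ W @ x_t + x_t @ W_d @ x_prev - theta @ x_t

# Toy run: symmetric connection matrix with zero diagonal
# (the diagonal entries are exactly what the paper says affects convergence).
rng = np.random.default_rng(0)
n = 6
A = rng.standard_normal((n, n))
W = (A + A.T) / 2
np.fill_diagonal(W, 0.0)
W_d = 0.1 * np.eye(n)   # simple delay coupling, chosen for illustration
theta = np.zeros(n)

x_prev = np.where(rng.standard_normal(n) >= 0, 1, -1)
x_t = np.where(rng.standard_normal(n) >= 0, 1, -1)
for _ in range(20):
    x_next = hnnd_step(W, W_d, theta, x_t, x_prev)
    x_prev, x_t = x_t, x_next
```

Tracking `energy(...)` along such a trajectory is the kind of bookkeeping the energy function method relies on: comparing energy values at successive states reveals whether the network is climbing toward a maximum or has entered a short cycle.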