Hopfield neural networks (HNNs) are a class of densely connected single-layer nonlinear networks of perceptrons. The network's energy function is defined through a learning procedure so that its minima coincide with states from a predefined set. However, because of the network's nonlinearity, a number of undesirable local energy minima emerge from the learning procedure. This has been shown to significantly affect the network's performance. In this brief, we present a stochastic process-enhanced binary HNN. Given a fixed network topology, the desired final distribution of states can be reached by modulating the network's stochastic process. We design this process, in a computationally efficient manner, by associating it with the stability intervals of the nondesired stable states of the network. Our experimental simulations confirm the predicted improvement in performance.
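To make the setting concrete, the following is a minimal sketch of a binary Hopfield network with Hebbian learning and stochastic (Glauber) update dynamics, where a temperature parameter `T` modulates the noise. This is a generic illustration of the class of networks the abstract describes, not the authors' stability-interval design; all function names and parameter values here are assumptions chosen for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

def train_hebbian(patterns):
    """Hebbian weights for a binary (+/-1) Hopfield network; zero diagonal."""
    n = patterns.shape[1]
    W = patterns.T @ patterns / n
    np.fill_diagonal(W, 0.0)
    return W

def energy(W, s):
    """Standard Hopfield energy E(s) = -1/2 s^T W s."""
    return -0.5 * s @ W @ s

def stochastic_update(W, s, T, steps=2000):
    """Asynchronous stochastic dynamics: each neuron flips with a sigmoid
    probability in its local field, scaled by temperature T. As T -> 0 this
    recovers the deterministic sign-threshold update."""
    s = s.copy()
    n = len(s)
    for _ in range(steps):
        i = rng.integers(n)
        h = W[i] @ s  # local field at neuron i
        p_plus = 1.0 / (1.0 + np.exp(-2.0 * h / T))  # P(s_i = +1)
        s[i] = 1 if rng.random() < p_plus else -1
    return s

# Store two random patterns, then recall the first from a corrupted probe.
patterns = rng.choice([-1, 1], size=(2, 64))
W = train_hebbian(patterns)
probe = patterns[0].copy()
flip = rng.choice(64, size=8, replace=False)
probe[flip] *= -1  # corrupt 8 of 64 bits
recalled = stochastic_update(W, probe, T=0.2)
overlap = abs(recalled @ patterns[0]) / 64
print(f"overlap with stored pattern: {overlap:.2f}")
```

At low loading (2 patterns over 64 neurons) the noisy probe typically relaxes back to the stored pattern; raising `T` lets the state escape shallow spurious minima at the cost of more residual noise, which is the trade-off a designed stochastic process aims to control.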