In this paper, the dynamical behavior of the basic node used to construct Hebbian artificial neural networks (NNs) is analyzed. Hebbian NNs are employed in communications and signal processing applications, among others. They have traditionally been studied through a continuous-time formulation whose validity is justified by analytical procedures that presume, among other hypotheses, a specific asymptotic behavior of the learning gain. The main contribution of this paper is the study of a deterministic discrete-time (DDT) formulation that characterizes the average evolution of the node, preserving the discrete-time form of the original network and capturing a more realistic behavior of the learning gain. The new DDT model yields instability results (critical in the case of signals with large, similar variances) that differ drastically from those known for the continuous-time formulation. Simulation examples support the presented results, illustrating the practical limitations of the basic Hebbian model.
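As a rough illustration of the kind of behavior at stake (a minimal sketch, not taken from the paper), the averaged DDT evolution of a plain Hebbian node can be written as w(k+1) = w(k) + eta * C * w(k), where C is the input correlation matrix and eta a constant, non-vanishing learning gain. Under these assumptions the weight norm grows geometrically, which is the well-known instability of the unnormalized Hebbian rule:

```python
import numpy as np

# Hypothetical parameters for illustration only:
eta = 0.1                      # constant (non-vanishing) learning gain
C = np.diag([1.0, 1.0])        # correlation matrix with large, similar variances
w = np.array([0.5, -0.3])      # initial weight vector

norms = []
for k in range(50):
    w = w + eta * C @ w        # averaged DDT Hebbian update
    norms.append(np.linalg.norm(w))

# With C = I, the norm grows as (1 + eta)^k times its initial value,
# so the weights diverge instead of settling on a principal direction.
print(norms[0], norms[-1])
```

This toy case only shows why a constant learning gain is problematic; the paper's analysis covers the general discrete-time dynamics rather than this scalar-growth special case.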