This paper presents a new recurrent dynamic neural network for solving signal analysis and processing problems. The network is composed of feedback-type connections and arrays of integrators, linear gains, and nonlinear activation functions. By seeking a global minimum energy state, the network solves for the best set of representation coefficients needed to model a given signal in terms of suitable elementary basis signals. An analytical model of the recurrent neural network is obtained by discretizing the integrator blocks and linearizing the activation function. Continuity of the algorithm across segment boundaries is maintained by varying the slope of the linearized activation function. The proposed approach yields a closed analytical form of the recurrent neural network solution. The perceived advantages of the network are the ability to estimate robustness, to predict convergence by examining the eigenvalues of the analytical state matrix, and to increase computational speed. Moreover, unlike traditional numerical methods, the new approach offers the possibility of handling time-varying signals with uncertainties.
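The core idea, a feedback network whose state flows toward a minimum-energy set of representation coefficients, can be sketched as a gradient-flow system discretized with forward Euler integration. The following is a minimal illustration under assumed details, not the paper's actual network: the energy is taken as the squared reconstruction error E(c) = ½‖s − Ac‖², the basis matrix `A`, step size `dt`, and iteration count are all hypothetical, and the nonlinear activation is omitted (i.e., treated as already linearized with unit slope).

```python
import numpy as np

def recurrent_solve(A, s, dt=0.01, steps=5000):
    """Discretized gradient-flow network: iterate the coefficient state
    toward the minimum of E(c) = 0.5 * ||s - A c||^2 (assumed energy)."""
    c = np.zeros(A.shape[1])                # representation coefficients
    for _ in range(steps):
        c = c + dt * A.T @ (s - A @ c)      # feedback update (Euler step)
    return c

# Hypothetical example: two elementary basis signals, target signal
# built from known coefficients so convergence can be checked.
t = np.linspace(0.0, 1.0, 100)
A = np.stack([np.sin(2 * np.pi * t), np.cos(2 * np.pi * t)], axis=1)
true_c = np.array([0.7, -0.3])
s = A @ true_c
c_hat = recurrent_solve(A, s)
print(np.round(c_hat, 3))
```

In this discretized form the state matrix of the iteration is I − dt·AᵀA, and, as the abstract indicates for the analytical model, convergence can be predicted from its eigenvalues: the update is stable when they lie inside the unit circle, i.e., when dt is small enough relative to the largest eigenvalue of AᵀA.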