We focus on the development of an efficient method for estimating the parameters of continuous dynamic synapse neural networks (cDSNN). We implement higher-order differential equations in the cDSNN, which necessitates a minor adjustment to the cDSNN architecture. The estimation of the network parameters is based on an extension of the quasilinearization algorithm, which provides an explicit analytic representation of the solution of a nonlinear differential equation. We apply higher-order cDSNNs trained with the extended quasilinearization algorithm to the isolated word recognition task. The features derived from the cDSNNs are classified using an HMM-based classifier. We show that cDSNN-based features are more robust in the presence of additive white Gaussian noise than state-of-the-art Mel-frequency features.
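The abstract refers to the classical quasilinearization (Bellman–Kalaba) algorithm, which replaces a nonlinear differential equation by a sequence of linear ones, each obtained by linearizing about the previous iterate. The following sketch illustrates that idea for a scalar ODE x' = f(x, t); it is not the paper's extended algorithm for cDSNNs, and the function names, the forward-Euler discretization, and the example ODE are illustrative assumptions.

```python
import numpy as np

def quasilinearize(f, df_dx, x0, t, n_iter=10):
    """Quasilinearization sketch for a scalar ODE x' = f(x, t), x(t[0]) = x0.

    Each iteration solves the *linear* time-varying ODE obtained by
    linearizing f about the previous iterate x_k:
        x_{k+1}'(t) = f(x_k, t) + df/dx(x_k, t) * (x_{k+1} - x_k).
    The linear ODE is integrated here by forward Euler for simplicity.
    """
    x = np.full_like(t, x0, dtype=float)  # initial guess: constant trajectory
    for _ in range(n_iter):
        x_new = np.empty_like(x)
        x_new[0] = x0
        for i in range(len(t) - 1):
            h = t[i + 1] - t[i]
            # right-hand side of the linearized ODE along the previous iterate
            rhs = f(x[i], t[i]) + df_dx(x[i], t[i]) * (x_new[i] - x[i])
            x_new[i + 1] = x_new[i] + h * rhs
        x = x_new
    return x

# Example: x' = -x^2 with x(0) = 1; the exact solution is x(t) = 1 / (1 + t).
t = np.linspace(0.0, 1.0, 201)
x = quasilinearize(lambda x, t: -x * x, lambda x, t: -2.0 * x, 1.0, t)
```

At the fixed point of the iteration the linear correction term vanishes, so the scheme recovers the (discretized) solution of the original nonlinear ODE; quasilinearization typically converges quadratically under mild smoothness conditions.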