Considerable attention has recently been devoted to neural architectures suited to non-stationary environments: Time Variant Neural Networks (TV-NN) are a relevant example in this field. Such networks have time-variant weights, each expressed as a linear combination of a set of basis functions. This inevitably increases the number of free parameters with respect to common feedforward architectures, making the learning procedure more complex. In this paper, an Extreme Learning Machine (ELM) approach is developed with the aim of accelerating the training of TV-NNs, extending the ELM approach already available for time-invariant neural structures. Computer simulations have been carried out, and the results confirm the effectiveness of the idea, showing that training time can be significantly reduced without degrading network performance in the testing phase.
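To fix ideas, the standard (time-invariant) ELM procedure that the paper extends can be sketched as follows: hidden-layer weights are drawn at random and frozen, and only the output weights are computed, in closed form via a least-squares fit. The data, activation, and hyper-parameters below are illustrative choices, not taken from the paper, and the TV-NN extension (basis-function expansion of the weights) is not shown.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy regression task (illustrative): learn y = sin(x)
X = np.linspace(-np.pi, np.pi, 200).reshape(-1, 1)
y = np.sin(X)

n_hidden = 50

# ELM step 1: hidden-layer weights and biases are random and never trained
W = rng.normal(size=(X.shape[1], n_hidden))
b = rng.normal(size=n_hidden)

# ELM step 2: hidden-layer activations
H = np.tanh(X @ W + b)

# ELM step 3: output weights via the Moore-Penrose pseudo-inverse,
# replacing iterative back-propagation -- the source of the speed-up
beta = np.linalg.pinv(H) @ y

y_hat = H @ beta
mse = np.mean((y - y_hat) ** 2)
```

Because step 3 is a single linear solve rather than a gradient-descent loop, training cost is dominated by one matrix factorization, which is why ELM-style learning is attractive for the enlarged parameter set of a TV-NN.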