Considerable research into the training of neural networks by the backpropagation technique has been undertaken in recent years. Introducing a momentum term into the weight-update equation can accelerate the training process. A new momentum step and a scheme for dynamically selecting the momentum rate are described. These are shown to give improved acceleration of training and strong global convergence characteristics. Results are presented for four benchmark training tasks.
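The momentum idea referred to above can be sketched as follows. This is a minimal illustration of the classic heavy-ball update, v ← μv − η∇E(w), w ← w + v, applied to a simple quadratic; the objective, learning rate, and fixed momentum rate are illustrative assumptions, not the paper's dynamic momentum scheme or its benchmark tasks.

```python
def train_with_momentum(grad, w, lr=0.1, mu=0.9, steps=200):
    """Minimise a scalar objective by gradient descent with a momentum term.

    grad: function returning dE/dw at the current weight w
    lr:   learning rate (eta)
    mu:   momentum rate; mu = 0 recovers plain gradient descent
    """
    v = 0.0  # velocity: exponentially weighted history of past steps
    for _ in range(steps):
        v = mu * v - lr * grad(w)  # momentum step reuses the previous direction
        w = w + v                  # apply the combined update
    return w

# Example: minimise E(w) = (w - 3)^2, whose gradient is 2*(w - 3).
w_final = train_with_momentum(lambda w: 2.0 * (w - 3.0), w=0.0)
```

With `mu = 0` the iterate crawls toward the minimum at a rate set by the learning rate alone; the momentum term lets successive steps in a consistent direction accumulate, which is the acceleration effect the abstract describes.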