The learning progress of conventional algorithms for multilayer feedforward neural networks, such as the momentum algorithm and the Delta-bar-Delta (DBD) algorithm, is studied by examining their learning trajectories on the error surfaces. This study empirically explains the stagnation of convergence observed during learning with the conventional algorithms. A new learning algorithm for multilayer feedforward neural networks is also proposed. The proposed algorithm adaptively updates the learning rates and momentum coefficients of the momentum algorithm according to the time change of a cost function, and is motivated by the novel shape of the error surfaces. Computer simulation results show that the new algorithm outperforms the conventional ones.
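The abstract states only that learning rates and momentum coefficients are adapted according to the time change of a cost function, without giving the exact update rule. The sketch below is a minimal illustration of that general idea, assuming a hypothetical multiplicative rule (grow the coefficients when the cost decreases, shrink them when it increases); the function name, the `up`/`down` factors, and the caps are illustrative choices, not the paper's method.

```python
import numpy as np

def adaptive_momentum_descent(grad, cost, w0, lr=0.1, mu=0.5,
                              up=1.05, down=0.7, steps=200):
    """Gradient descent with momentum, where lr and mu are adapted to the
    sign of the cost change (hypothetical rule, for illustration only)."""
    w = np.asarray(w0, dtype=float)
    v = np.zeros_like(w)          # momentum (velocity) term
    prev_cost = cost(w)
    for _ in range(steps):
        v = mu * v - lr * grad(w)
        w = w + v
        c = cost(w)
        if c < prev_cost:
            # cost decreased: take more aggressive steps (with caps)
            lr = min(lr * up, 0.5)
            mu = min(mu * up, 0.9)
        else:
            # cost increased: damp both coefficients
            lr *= down
            mu *= down
        prev_cost = c
    return w, prev_cost

# Example: minimize a simple quadratic bowl, cost(w) = 0.5 * ||w||^2.
cost = lambda w: 0.5 * float(w @ w)
grad = lambda w: w
w_final, c_final = adaptive_momentum_descent(grad, cost, [3.0, -2.0])
```

On this quadratic test function the coefficients grow while progress is steady and are damped whenever a momentum overshoot raises the cost, which is the qualitative behavior the abstract ascribes to the proposed algorithm.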
Proceedings of 1993 International Joint Conference on Neural Networks (IJCNN '93-Nagoya), Volume 1
Date of Conference: 25-29 Oct. 1993