This paper presents two novel families of error functions as generalized training criteria for feedforward neural networks; they significantly accelerate convergence in the middle and final stages of training. With optimized parameters, they train faster than the original fast backpropagation algorithm. Several approaches to parameter optimization are explored and verified experimentally.
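The abstract does not specify the paper's error functions, but the general mechanism it relies on can be sketched: in backpropagation, the error criterion enters only through its derivative at the output layer, so swapping the criterion changes a single term of the weight update. The sketch below is a hypothetical illustration (not the paper's method), contrasting the standard squared-error delta with a cross-entropy delta, a well-known alternative criterion whose gradient does not vanish when a sigmoid output saturates, which is one way an alternative error function can speed late-stage convergence.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def mse_delta(y, t):
    # Squared-error criterion E = 0.5*(y - t)^2 with sigmoid outputs:
    # delta = dE/dy * y*(1 - y); shrinks toward 0 as y saturates.
    return (y - t) * y * (1.0 - y)

def cross_entropy_delta(y, t):
    # Cross-entropy criterion E = -[t*log(y) + (1-t)*log(1-y)]:
    # the sigmoid derivative cancels, leaving delta = y - t,
    # which stays large even when y is saturated near 0 or 1.
    return y - t

def train_step(w, x, t, delta_fn, lr=0.5):
    # One gradient step for a single-layer sigmoid unit; only the
    # delta_fn argument changes when the error function is swapped.
    y = sigmoid(x @ w)
    grad = x.T @ delta_fn(y, t) / len(x)
    return w - lr * grad
```

With a saturated output (e.g. y = 0.999, target 0), `mse_delta` is roughly a thousand times smaller than `cross_entropy_delta`, illustrating why the choice of criterion matters most in the late stages of training.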
6th International Conference on Signal Processing, 2002 (Volume 2)
Date of Conference: 26-30 Aug. 2002