The dynamic decay adjustment (DDA) algorithm is a fast constructive algorithm for training RBF and PNN neural networks. The algorithm has two parameters, namely, θ+ and θ-. The papers that introduced DDA argued that these parameters do not heavily influence classification performance and therefore recommended always using their default values. In contrast, this paper shows that smaller values of the parameter θ- can, for a considerable number of datasets, yield remarkable improvements in generalization performance.
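To make the role of θ+ and θ- concrete, the following is a minimal sketch of one DDA training epoch for a Gaussian-RBF classifier. It follows the usual description of the constructive rule: a sample already covered by a same-class prototype (activation ≥ θ+) reinforces that prototype; otherwise a new prototype is committed at the sample; finally, the widths of conflicting-class prototypes are shrunk so that their activation at the sample drops to θ-. All names, the dict-based prototype representation, and the initial width of 1.0 are illustrative assumptions, not the paper's implementation; the defaults θ+ = 0.4 and θ- = 0.2 are the values recommended in the original DDA papers.

```python
import numpy as np

def rbf_activation(x, center, sigma):
    """Gaussian RBF activation of one prototype at sample x."""
    d2 = np.sum((np.asarray(x, dtype=float) - center) ** 2)
    return np.exp(-d2 / sigma ** 2)

def dda_train_epoch(X, y, prototypes, theta_plus=0.4, theta_minus=0.2):
    """One epoch of the DDA constructive rule (simplified sketch).

    prototypes: list of dicts with keys 'center', 'sigma', 'weight', 'label'.
    theta_plus / theta_minus are the two DDA thresholds.
    """
    for x, label in zip(X, y):
        x = np.asarray(x, dtype=float)
        # Commit step: is x covered by a same-class prototype?
        covered = False
        for p in prototypes:
            if p['label'] == label and \
                    rbf_activation(x, p['center'], p['sigma']) >= theta_plus:
                p['weight'] += 1.0  # reinforce the covering prototype
                covered = True
                break
        if not covered:
            # Introduce a new prototype centered at the sample
            # (initial width 1.0 is an assumption for this sketch).
            prototypes.append({'center': x.copy(), 'sigma': 1.0,
                               'weight': 1.0, 'label': label})
        # Shrink step: conflicting-class prototypes must activate
        # at most theta_minus at x.
        for p in prototypes:
            if p['label'] != label:
                d2 = np.sum((x - p['center']) ** 2)
                if d2 > 0 and np.exp(-d2 / p['sigma'] ** 2) > theta_minus:
                    # Largest sigma with exp(-d2/sigma^2) == theta_minus.
                    p['sigma'] = np.sqrt(d2 / -np.log(theta_minus))
    return prototypes
```

The shrink step makes the paper's point visible: a smaller θ- forces conflicting prototypes to contract more aggressively around class boundaries, which is where the reported generalization gains come from.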
Date of Conference: 6-9 Nov. 2005