The paper presents the design of three types of neural networks with different features: traditional backpropagation networks, radial basis function networks, and counterpropagation networks. Traditional backpropagation networks require a complex iterative training process before they can be applied to classification or approximation. Radial basis function networks simplify training through their specially organized three-layer architecture. Counterpropagation networks need no training process at all and can be designed directly by extracting all parameters from the input data. Both the design complexity and the generalization ability of the three network architectures are compared on a digit image recognition problem.
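To illustrate why a radial basis function network simplifies training, here is a minimal sketch (an assumption-laden illustration, not the paper's exact design): the hidden layer consists of Gaussian units with fixed centers, so "training" reduces to a single linear least-squares solve for the output weights.

```python
import numpy as np

# Minimal RBF-network sketch. The centers and width are chosen by hand here;
# the paper's actual parameter-selection procedure may differ.

def rbf_features(X, centers, width):
    # Gaussian activation of each hidden unit for each input row.
    d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
    return np.exp(-d2 / (2.0 * width ** 2))

def fit_rbf(X, y, centers, width):
    # With centers fixed, the output layer is linear, so fitting it
    # is a closed-form least-squares problem -- no backpropagation.
    H = rbf_features(X, centers, width)
    w, *_ = np.linalg.lstsq(H, y, rcond=None)
    return w

def predict_rbf(X, centers, width, w):
    return rbf_features(X, centers, width) @ w

# Toy usage: approximate y = sin(x) with 10 fixed centers.
X = np.linspace(0, 2 * np.pi, 50)[:, None]
y = np.sin(X[:, 0])
centers = np.linspace(0, 2 * np.pi, 10)[:, None]
w = fit_rbf(X, y, centers, width=0.7)
err = np.abs(predict_rbf(X, centers, 0.7, w) - y).max()
```

The contrast with backpropagation is that no gradient iterations are needed once the hidden-layer parameters are fixed; this is the training simplification the abstract refers to.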