Issue 6 • November 1991
A memory-based network that provides estimates of continuous variables and converges to the underlying (linear or nonlinear) regression surface is described. The general regression neural network (GRNN) is a one-pass learning algorithm with a highly parallel structure. It is shown that, even with sparse data in a multidimensional measurement space, the algorithm provides smooth transitions from on...
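The one-pass, memory-based behaviour described in this abstract can be sketched as a kernel-weighted average over stored training samples: each sample acts as a Gaussian kernel centre, and "learning" amounts to memorising the data. This is a minimal illustration, not Specht's full formulation; the function name, test data, and bandwidth value are chosen for the example.

```python
import numpy as np

def grnn_predict(X_train, y_train, x, sigma=0.5):
    """GRNN-style estimate at point x: a distance-weighted average of
    stored targets, using Gaussian kernels centred on each training sample."""
    d2 = np.sum((X_train - x) ** 2, axis=1)   # squared distances to x
    w = np.exp(-d2 / (2.0 * sigma ** 2))      # Gaussian kernel weights
    return float(np.dot(w, y_train) / np.sum(w))

# One-pass "training" is just storing sparse samples of y = x^2.
X = np.linspace(-1, 1, 9).reshape(-1, 1)
y = (X ** 2).ravel()
est = grnn_predict(X, y, np.array([0.3]), sigma=0.2)
```

With a small bandwidth the estimate at 0.3 lands near the true value 0.09, smoothly interpolated from the neighbouring stored points.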
A closed-form solution for improved pattern recognition that reduces the training time to a single epoch (one presentation of each of the training patterns) is presented. It is shown that the corresponding hardware requirements are no greater than those for regular recognition under certain conditions. A simple example which shows that the generalization obtained with the closed-form method exceed...
A classifier that incorporates both preprocessing and postprocessing procedures as well as a multilayer feedforward network (based on the back-propagation algorithm) in its design to distinguish between several major classes of radar returns including weather, birds, and aircraft is described. The classifier achieves an average classification accuracy of 89% on generalization for data collected du...
Necessary and sufficient conditions are derived for the weights of a generalized correlation matrix of a bidirectional associative memory (BAM) which guarantee the recall of all training pairs. A linear programming/multiple training (LP/MT) method that determines weights which satisfy the conditions when a solution is feasible is presented. The sequential multiple training (SMT) method is shown to...
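For readers unfamiliar with the BAM being analysed here, the baseline recall procedure can be sketched with Kosko's plain correlation-matrix encoding (W = Σᵢ aᵢbᵢᵀ over bipolar pairs), which is the scheme the LP/MT and SMT methods improve on. This is a generic illustration with hypothetical patterns, not the paper's weight-selection method.

```python
import numpy as np

def bam_weights(A, B):
    """Correlation-matrix encoding: W = sum_i a_i b_i^T,
    where rows of A and B are bipolar (+1/-1) training pairs."""
    return A.T @ B

def bam_recall(W, a, steps=10):
    """Bidirectional recall: alternate a -> b -> a through the
    thresholded (sign) layers until the pattern stabilises."""
    sign = lambda v: np.where(v >= 0, 1, -1)
    b = sign(a @ W)
    for _ in range(steps):
        b = sign(a @ W)
        a_new = sign(b @ W.T)
        if np.array_equal(a_new, a):
            break
        a = a_new
    return a, b

# Two hypothetical bipolar training pairs
A = np.array([[1, -1, 1, -1], [1, 1, -1, -1]])
B = np.array([[1, 1, -1], [-1, 1, 1]])
W = bam_weights(A, B)
a_rec, b_rec = bam_recall(W, A[0])
```

With orthogonal-enough patterns, presenting A[0] recalls its paired B[0]; the paper's conditions characterise exactly when weights guaranteeing such recall for all pairs exist.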
The relationship between the number of hidden nodes in a neural network, the complexity of a multiclass discrimination problem, and the number of samples needed for effective learning is discussed. Bounds on the number of samples needed for effective learning are given. It is shown that Ω(min(d, n)M) boundary samples are required for successful classification of ...
Aims & Scope
IEEE Transactions on Neural Networks is devoted to the science and technology of neural networks, publishing articles that disclose significant technical knowledge, exploratory developments, and applications of neural networks, from biology to software to hardware.
This Transactions ceased production in 2011. The current retitled publication is IEEE Transactions on Neural Networks and Learning Systems.