IEEE Transactions on Neural Networks

Issue 5 • Date Sep 1991

Articles in this issue: 9
  • Worst-case convergence times for Hopfield memories

    Publication Year: 1991, Page(s):533 - 535
    Cited by:  Papers (7)

    The worst-case upper bound on the convergence time of Hopfield associative memories is improved to half of its previously known value. Also, the consequences of allowing `don't know' bits in both the input and the output are considered.

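The abstract above concerns worst-case bounds on how long asynchronous recall can take. As context, a minimal sketch of Hebbian-rule Hopfield recall that counts full update sweeps until a fixed point is reached (the patterns, network size, and update schedule here are illustrative choices, not the paper's setup):

```python
import numpy as np

rng = np.random.default_rng(0)

# Store two orthogonal bipolar patterns with the Hebbian outer-product rule.
patterns = np.array([[1, -1, 1, -1, 1, -1, 1, -1],
                     [1, 1, -1, -1, 1, 1, -1, -1]])
n = patterns.shape[1]
W = (patterns.T @ patterns).astype(float) / n
np.fill_diagonal(W, 0.0)                  # no self-connections

def recall(x, max_sweeps=100):
    """Asynchronous recall; returns the fixed point and sweeps used."""
    x = x.copy()
    for sweep in range(1, max_sweeps + 1):
        changed = False
        for i in rng.permutation(n):      # random update order
            new = 1 if W[i] @ x >= 0 else -1
            if new != x[i]:
                x[i] = new
                changed = True
        if not changed:                   # a full sweep with no flips: converged
            return x, sweep
    return x, max_sweeps

# Corrupt one bit of the first pattern; recall restores it in few sweeps.
probe = patterns[0].copy()
probe[0] = -probe[0]
fixed, sweeps = recall(probe)
```

Because each asynchronous flip can only lower the network energy, convergence is guaranteed; the paper's contribution is a tighter bound on how many such steps are needed in the worst case.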
  • Stochastic competitive learning

    Publication Year: 1991, Page(s):522 - 529
    Cited by:  Papers (50)

    Competitive learning systems are examined as stochastic dynamical systems. This includes continuous and discrete formulations of unsupervised, supervised, and differential competitive learning systems. These systems estimate an unknown probability density function from random pattern samples and behave as adaptive vector quantizers. Synaptic vectors in feedforward competitive neural networks qua…

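The "adaptive vector quantizer" behaviour described above can be sketched with plain winner-take-all competitive learning: each random sample moves only the nearest synaptic vector, so the vectors drift toward cluster centroids. The data, initialization, and constant learning rate below are illustrative assumptions (practical systems usually decay the rate):

```python
import numpy as np

rng = np.random.default_rng(1)

# Two well-separated 2-D clusters of random pattern samples.
samples = np.vstack([rng.normal([0.0, 0.0], 0.1, size=(200, 2)),
                     rng.normal([5.0, 5.0], 0.1, size=(200, 2))])
rng.shuffle(samples)

# Two synaptic (codebook) vectors, initialized on either side of the data.
w = np.array([[1.0, 1.0], [4.0, 4.0]])
lr = 0.05
for x in samples:
    winner = np.argmin(np.linalg.norm(w - x, axis=1))
    w[winner] += lr * (x - w[winner])     # only the winner moves

# Each synaptic vector ends up near one cluster centroid.
```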
  • Invariance and neural nets

    Publication Year: 1991, Page(s):498 - 508
    Cited by:  Papers (42)

    Application of neural nets to invariant pattern recognition is considered. The authors study various techniques for obtaining this invariance with neural net classifiers and identify the invariant-feature technique as the most suitable for current neural classifiers. A novel formulation of invariance in terms of constraints on the feature values leads to a general method for transforming any given…

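A classic instance of the invariant-feature technique favoured above: feed the classifier features that are unchanged by the nuisance transformation. For circular translation, the DFT magnitude spectrum is such a feature (the 1-D signal here is an illustrative stand-in for an image pattern):

```python
import numpy as np

# A 1-D pattern and a circularly shifted copy of it.
x = np.array([0.0, 1.0, 3.0, 2.0, 0.0, 0.0, 0.0, 0.0])
x_shifted = np.roll(x, 3)

# A circular shift only changes the phase of the DFT, not its magnitude,
# so |FFT| is a translation-invariant feature vector for a classifier.
feat = np.abs(np.fft.fft(x))
feat_shifted = np.abs(np.fft.fft(x_shifted))
```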
  • A new back-propagation algorithm with coupled neuron

    Publication Year: 1991, Page(s):535 - 538
    Cited by:  Papers (16)

    A novel neuron model and its learning algorithm are presented. They provide a novel approach for speeding up convergence in the learning of layered neural networks and for training networks of neurons with a nondifferentiable output function by using the gradient descent method. The neuron is called a saturating linear coupled neuron (sl-CONE). From simulation results, it is shown that the sl-CONE…

  • CMAC-based adaptive critic self-learning control

    Publication Year: 1991, Page(s):530 - 533
    Cited by:  Papers (34)  |  Patents (1)

    A technique that integrates the cerebellar model articulation controller (CMAC) into a self-learning control scheme developed by A.G. Barto et al. (IEEE Trans. Syst. Man., Cybern., vol.SMC-13, p.834-46, Sept./Oct. 1983) is presented. Instead of reserving one input line (as a memory) for each quantized state, the integrated technique distributively stores learned information; this reduces the requi…

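The "distributive storage" idea can be sketched with a toy 1-D CMAC: several offset tilings cover the input range, each input activates one tile per tiling, and the output is the sum of the activated weights, so nearby inputs share weights and memory does not grow with the number of quantized states. The tiling counts and learning rate below are illustrative assumptions, not the paper's configuration:

```python
import numpy as np

class CMAC1D:
    """Minimal 1-D CMAC over [0, 1]: overlapping offset tilings,
    distributed (shared-weight) storage across tiles."""
    def __init__(self, n_tilings=8, n_tiles=16, lr=0.2):
        self.n_tilings, self.n_tiles, self.lr = n_tilings, n_tiles, lr
        self.w = np.zeros((n_tilings, n_tiles + 1))
        self.offsets = np.arange(n_tilings) / (n_tilings * n_tiles)

    def _tiles(self, x):
        # One active tile index per tiling, shifted by that tiling's offset.
        return np.floor((x + self.offsets) * self.n_tiles).astype(int)

    def predict(self, x):
        return self.w[np.arange(self.n_tilings), self._tiles(x)].sum()

    def update(self, x, target):
        # Spread the error equally over the activated tiles.
        err = target - self.predict(x)
        self.w[np.arange(self.n_tilings), self._tiles(x)] += \
            self.lr * err / self.n_tilings

# Train on a smooth target; shared tiles make learning generalize locally.
cmac = CMAC1D()
xs = np.linspace(0.0, 1.0, 200)
for _ in range(50):
    for x in xs:
        cmac.update(x, np.sin(2 * np.pi * x))
```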
  • Equilibrium characterization of dynamical neural networks and a systematic synthesis procedure for associative memories

    Publication Year: 1991, Page(s):509 - 521
    Cited by:  Papers (40)

    Several novel results concerning the characterization of the equilibrium conditions of a continuous-time dynamical neural network model and a systematic procedure for synthesizing associative memory networks with nonsymmetrical interconnection matrices are presented. The equilibrium characterization focuses on the exponential stability and instability properties of the network equilibria and on eq…

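The kind of exponential-stability check described above can be illustrated on a small continuous-time network: find an equilibrium, linearize, and test whether all Jacobian eigenvalues have negative real part. The model, weights, and solver below are illustrative assumptions, not the paper's exact formulation:

```python
import numpy as np

# Illustrative continuous-time network: dx/dt = -x + W @ tanh(x) + b.
W = np.array([[0.2, -0.5],
              [0.4, 0.1]])
b = np.array([0.1, -0.2])

def f(x):
    return -x + W @ np.tanh(x) + b

# Find an equilibrium (f(x) = 0) by damped fixed-point iteration,
# which converges here because ||W|| < 1 makes the map a contraction.
x = np.zeros(2)
for _ in range(200):
    x = 0.5 * x + 0.5 * (W @ np.tanh(x) + b)

# Jacobian at the equilibrium: J = -I + W @ diag(sech^2(x)).
J = -np.eye(2) + W @ np.diag(1.0 - np.tanh(x) ** 2)
stable = bool(np.all(np.linalg.eigvals(J).real < 0))
```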
  • A neural network approach to a Bayesian statistical decision problem

    Publication Year: 1991, Page(s):538 - 540
    Cited by:  Papers (9)

    Generalized mean-squared error (GMSE) objective functions are proposed that can be used in neural networks to yield a Bayes optimal solution to a statistical decision problem characterized by a generic loss function.

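The target behaviour of such objectives is the Bayes rule: choose the decision minimizing posterior expected loss under the given loss function. A minimal sketch with an assumed asymmetric two-class loss matrix (not taken from the paper):

```python
import numpy as np

# Loss matrix L[decision, true_class]: missing class 1 costs 5x a false alarm.
L = np.array([[0.0, 5.0],
              [1.0, 0.0]])

def bayes_decide(posterior):
    """Pick the decision minimizing posterior expected loss L @ posterior."""
    return int(np.argmin(L @ posterior))
```

With symmetric 0-1 loss the decision threshold sits at posterior 0.5; with this loss matrix, class 1 is chosen whenever its posterior exceeds 1/6, because its misclassification cost is five times larger.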
  • Convergence of learning algorithms with constant learning rates

    Publication Year: 1991, Page(s):484 - 489
    Cited by:  Papers (80)

    The behavior of neural network learning algorithms with a small, constant learning rate, ε, in stationary, random input environments is investigated. It is rigorously established that the sequence of weight estimates can be approximated by a certain ordinary differential equation, in the sense of weak convergence of random processes as ε tends to zero. As applications, backpropagation in…

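The ODE approximation can be illustrated with scalar LMS: with a small constant rate ε, the noisy weight iterates track the mean ODE dw/dt = R(a − w) on the time scale t = ε·step, where R = E[x²]. The model and constants below are illustrative (this is a sketch of the phenomenon, not the paper's proof):

```python
import numpy as np

rng = np.random.default_rng(2)

# Scalar LMS: inputs x ~ N(0, 1), targets y = a*x + noise, R = E[x^2] = 1.
a, eps, steps = 2.0, 0.01, 1000
w = 0.0
traj = []
for _ in range(steps):
    x = rng.normal()
    y = a * x + 0.1 * rng.normal()
    w += eps * x * (y - x * w)        # stochastic update, constant rate eps
    traj.append(w)

# Euler solution of the mean ODE dw/dt = a - w on the same time scale.
w_ode = 0.0
ode = []
for _ in range(steps):
    w_ode += eps * (a - w_ode)
    ode.append(w_ode)
```

Both trajectories relax toward a = 2; the stochastic one fluctuates around the ODE path with an amplitude that shrinks as ε tends to zero, which is the weak-convergence statement in the abstract.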
  • An information criterion for optimal neural network selection

    Publication Year: 1991, Page(s):490 - 497
    Cited by:  Papers (112)  |  Patents (1)

    The choice of an optimal neural network design for a given problem is addressed. A relationship between optimal network design and statistical model identification is described. A derivative of Akaike's information criterion (AIC) is given. This modification yields an information statistic which can be used to objectively select a `best' network for binary classification problems. The technique ca…

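The plain AIC that the paper modifies trades fit against parameter count: AIC = 2k − 2 ln L, smaller is better. A sketch on a binary classification problem, using simple logistic models in place of neural networks (the data, the gradient-ascent fitter, and the candidate set are illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(3)

# Binary labels generated by a logistic model in one informative feature;
# columns 1 and 2 are irrelevant noise.
n = 500
x = rng.normal(size=(n, 3))
p = 1.0 / (1.0 + np.exp(-2.0 * x[:, 0]))
y = (rng.random(n) < p).astype(float)

def fit_logistic(X, y, iters=500, lr=0.1):
    """Plain gradient-ascent logistic fit; returns weights, log-likelihood."""
    Xb = np.hstack([np.ones((len(X), 1)), X])   # bias column
    w = np.zeros(Xb.shape[1])
    for _ in range(iters):
        q = 1.0 / (1.0 + np.exp(-Xb @ w))
        w += lr * Xb.T @ (y - q) / len(y)
    q = np.clip(1.0 / (1.0 + np.exp(-Xb @ w)), 1e-12, 1 - 1e-12)
    loglik = np.sum(y * np.log(q) + (1 - y) * np.log1p(-q))
    return w, loglik

# AIC = 2k - 2 ln L over nested candidates using the first k features
# (k + 1 parameters including the bias); pick the minimizer.
aics = {}
for k in (0, 1, 2, 3):
    _, ll = fit_logistic(x[:, :k], y)
    aics[k] = 2 * (k + 1) - 2 * ll
best = min(aics, key=aics.get)
```

The intercept-only model (k = 0) fits poorly and is rejected; among the rest, the 2k penalty discourages the models carrying the noise features.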

Aims & Scope

IEEE Transactions on Neural Networks is devoted to the science and technology of neural networks, publishing significant technical knowledge, exploratory developments, and applications of neural networks spanning biology, software, and hardware.

 

This Transactions ceased production in 2011. The current retitled publication is IEEE Transactions on Neural Networks and Learning Systems.
