
IEEE Transactions on Neural Networks

Issue 3 • May 1991

  • A multilayer neural network with piecewise-linear structure and back-propagation learning

    Publication Year: 1991, Page(s):395 - 403
    Cited by:  Papers (34)  |  Patents (1)

    A multilayer neural network in which every cascaded section has a two-layer piecewise-linear structure is proposed. The nonlinear elements are neither sigmoidal nor of a signum type: each is an absolute value operator, which is almost everywhere differentiable and therefore keeps back-propagation feasible in a digital setting. Both the feedforward signal pro... (See the absolute-value activation sketch after this listing.)

  • A general weight matrix formulation using optimal control

    Publication Year: 1991, Page(s):378 - 394
    Cited by:  Papers (13)

    Classical methods from optimal control theory are used to derive general forms for neural network weights. The network's learning or application task is encoded in a performance index of a general structure, so different instances of this performance index lead to special cases of weight rules, including some well-known forms. Comparisons are made with the outer product rule, spectral m... (See the outer-product rule sketch after this listing.)

  • A neural network approach to statistical pattern classification by 'semiparametric' estimation of probability density functions

    Publication Year: 1991, Page(s):366 - 377
    Cited by:  Papers (114)  |  Patents (1)

    A method for designing near-optimal nonlinear classifiers is described, based on a self-organizing technique for estimating probability density functions when only weak assumptions are made about the densities. The method avoids the disadvantages of other existing methods by parametrizing a set of component densities from which the actual densities are constructed. The parameters of the component den... (See the mixture-classifier sketch after this listing.)

  • Handwritten alphanumeric character recognition by the neocognitron

    Publication Year: 1991, Page(s):355 - 365
    Cited by:  Papers (146)  |  Patents (1)

    A pattern recognition system that works with the mechanism of the neocognitron, a neural network model for deformation-invariant visual pattern recognition developed by Fukushima (1980), is discussed. The system has been trained to recognize 35 handwritten alphanumeric characters. The ability to recognize deformed characters correctly depends strongly on the choice of the tr... (See the S-cell/C-cell sketch after this listing.)

  • An adaptively trained neural network

    Publication Year: 1991, Page(s):334 - 345
    Cited by:  Papers (47)  |  Patents (1)

    A training procedure that adapts the weights of a trained layered perceptron artificial neural network to training data originating from a slowly varying nonstationary process is proposed. The resulting adaptively trained neural network (ATNN), based on nonlinear programming techniques, is shown to adapt to new training data that are in conflict with earlier training data without affecting the neu...

  • Fast training algorithms for multilayer neural nets

    Publication Year: 1991, Page(s):346 - 354
    Cited by:  Papers (90)  |  Patents (3)

    An algorithm that is faster than back-propagation and that does not require the number of hidden units to be specified in advance is described. Its relationship to other fast pattern-recognition algorithms, such as algorithms based on k-d trees, is discussed. The algorithm has been implemented and tested on artificial problems, such as the parity problem, and on real problems arising in speec... (See the k-d tree sketch after this listing.)

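For the piecewise-linear network in the first item, a minimal sketch of the absolute-value nonlinearity and its role in back-propagation is given below. It assumes a generic one-hidden-layer network with squared-error loss, not the paper's cascaded two-layer sections; all function names and shapes are illustrative.

    import numpy as np

    # Absolute-value units in place of sigmoids: d|z|/dz = sign(z) almost
    # everywhere (undefined only at z = 0), so ordinary back-propagation applies.

    def forward(x, W1, b1, W2, b2):
        z = W1 @ x + b1      # pre-activations of the piecewise-linear layer
        h = np.abs(z)        # absolute-value nonlinearity
        y = W2 @ h + b2      # linear output layer
        return y, z, h

    def backprop_step(x, t, W1, b1, W2, b2, lr=0.01):
        y, z, h = forward(x, W1, b1, W2, b2)
        dy = y - t                      # gradient of 0.5 * ||y - t||^2
        dW2, db2 = np.outer(dy, h), dy
        dz = (W2.T @ dy) * np.sign(z)   # chain rule through |z|
        dW1, db1 = np.outer(dz, x), dz
        return W1 - lr * dW1, b1 - lr * db1, W2 - lr * dW2, b2 - lr * db2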
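The outer product rule that the weight-matrix paper uses as a comparison point is the classical associative-memory construction. The sketch below shows only that well-known baseline over bipolar patterns (Hopfield-style), not the paper's optimal-control derivation; the function names are illustrative.

    import numpy as np

    # Outer-product (Hebbian) rule: W = sum_i x_i x_i^T with a zeroed diagonal,
    # for bipolar patterns x_i in {-1, +1}^n.

    def outer_product_weights(patterns):
        n = patterns.shape[1]
        W = np.zeros((n, n))
        for p in patterns:
            W += np.outer(p, p)
        np.fill_diagonal(W, 0.0)
        return W

    def recall(W, x):
        # one synchronous update step of the associative memory
        return np.sign(W @ x)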
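The 'semiparametric' classifier builds each class-conditional density from a set of component densities. The paper uses its own self-organizing estimation scheme; the sketch below substitutes scikit-learn's EM-based GaussianMixture purely as a stand-in, so the class name, the choice of three components, and the use of sklearn are assumptions, not the paper's method.

    import numpy as np
    from sklearn.mixture import GaussianMixture

    class MixtureClassifier:
        """Fit one mixture of component densities per class; classify by Bayes' rule."""

        def __init__(self, n_components=3):
            self.n_components = n_components

        def fit(self, X, y):
            self.classes_ = np.unique(y)
            self.models_, self.log_priors_ = [], []
            for c in self.classes_:
                Xc = X[y == c]
                self.models_.append(GaussianMixture(n_components=self.n_components).fit(Xc))
                self.log_priors_.append(np.log(len(Xc) / len(X)))
            return self

        def predict(self, X):
            # log p(x | c) + log P(c) for each class; choose the maximum.
            scores = np.column_stack([
                gm.score_samples(X) + lp
                for gm, lp in zip(self.models_, self.log_priors_)
            ])
            return self.classes_[np.argmax(scores, axis=1)]

Usage would be MixtureClassifier(n_components=3).fit(X_train, y_train).predict(X_test), with the per-class component count chosen by whatever model-selection rule is appropriate.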
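The neocognitron alternates feature-extracting S-cells with C-cells that tolerate small positional shifts of the detected feature. The sketch below is only a schematic of that alternation (a rectified template match followed by local max pooling); Fukushima's actual cell equations, with shunting inhibition and trained connections, are not reproduced here.

    import numpy as np

    def s_layer(image, template):
        # S-cells: rectified template matching at every position (schematic).
        th, tw = template.shape
        H, W = image.shape
        out = np.zeros((H - th + 1, W - tw + 1))
        for i in range(out.shape[0]):
            for j in range(out.shape[1]):
                out[i, j] = max(0.0, np.sum(image[i:i + th, j:j + tw] * template))
        return out

    def c_layer(feature_map, pool=2):
        # C-cells: pool over a neighbourhood so small shifts give the same response.
        H, W = feature_map.shape
        out = np.zeros((H // pool, W // pool))
        for i in range(out.shape[0]):
            for j in range(out.shape[1]):
                out[i, j] = feature_map[i*pool:(i+1)*pool, j*pool:(j+1)*pool].max()
        return out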
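The k-d tree algorithms that the fast-training paper relates its method to are easiest to picture as a nearest-neighbour classifier whose lookups go through a k-d tree. The sketch below shows only that related technique, using SciPy's KDTree; it is not the paper's training algorithm, and the class name is an assumption.

    import numpy as np
    from scipy.spatial import KDTree

    class KDTreeNN:
        def fit(self, X, y):
            self.tree = KDTree(X)          # k-d tree over the training points
            self.y = np.asarray(y)
            return self

        def predict(self, X):
            _, idx = self.tree.query(X, k=1)   # index of the nearest stored point
            return self.y[idx]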

Aims & Scope

IEEE Transactions on Neural Networks is devoted to the science and technology of neural networks, publishing significant technical knowledge, exploratory developments, and applications of neural networks spanning biology, software, and hardware.

 

This Transactions ceased production in 2011. The current retitled publication is IEEE Transactions on Neural Networks and Learning Systems.
