Proceedings of the IEEE

Issue 10 • October 1990

Displaying Results 1 - 17 of 17
  • CMAC: an associative neural network alternative to backpropagation

    Publication Year: 1990, Page(s):1561 - 1567
    Cited by:  Papers (184)  |  Patents (4)
    PDF (948 KB)

    The CMAC (cerebellar model arithmetic computer) neural network, an alternative to backpropagated multilayer networks, is described. The following advantages of CMAC are discussed: local generalization, rapid algorithmic computation based on LMS (least-mean-square) training, incremental training, functional representation, output superposition, and a fast practical hardware realization. A geometric...
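The table-lookup-with-LMS idea described above is easy to sketch. The following is a minimal, hypothetical 1-D CMAC; the tiling count, cell width, and learning rate are illustrative choices, not the paper's values.

```python
# Hypothetical 1-D CMAC: each input activates `n_tilings` overlapping table
# cells; the output is the sum of the active weights; training is an
# incremental LMS correction shared across the active cells.

def active_cells(x, n_tilings=4, cell_width=1.0):
    """(tiling, cell index) pairs activated by scalar input x."""
    return [(t, int((x + t * cell_width / n_tilings) // cell_width))
            for t in range(n_tilings)]

def cmac_predict(weights, x):
    return sum(weights.get(c, 0.0) for c in active_cells(x))

def cmac_train(weights, x, target, lr=0.2):
    cells = active_cells(x)
    err = target - cmac_predict(weights, x)
    for c in cells:                          # share the LMS correction
        weights[c] = weights.get(c, 0.0) + lr * err / len(cells)

weights = {}
for _ in range(50):                          # incremental training on f(x) = x^2
    for x in (0.5, 1.5, 2.5):
        cmac_train(weights, x, x * x)
```

Because generalization is local, training at one input only perturbs nearby inputs that share table cells with it.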
  • Maximum a posteriori decision and evaluation of class probabilities by Boltzmann perceptron classifiers

    Publication Year: 1990, Page(s):1620 - 1628
    Cited by:  Papers (7)
    PDF (620 KB)

    It is shown that neural network architectures may offer a valuable alternative to the Bayesian classifier. With neural networks, the a posteriori probabilities are computed with no a priori assumptions about the probability distribution functions (PDFs) that generate the data. Rather than assuming certain types of PDFs for the input data, the neural classifier uses a general type of input-output m...
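The claim that such networks estimate a posteriori probabilities directly can be illustrated with a much smaller model than a Boltzmann perceptron; the unit, data stream, and learning rate below are illustrative choices, not the paper's. A logistic unit trained with the cross-entropy gradient converges to the empirical class frequency at its input, with no assumption about the class-conditional PDFs.

```python
import math

# Illustrative logistic unit: trained with the cross-entropy gradient, its
# output converges to the empirical frequency of class 1 at this input,
# i.e. it estimates P(class = 1 | x) without modelling the PDFs.

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

w, b, x = 0.0, 0.0, 1.0
labels = [1] * 7 + [0] * 3            # class 1 occurs 70% of the time at x
for _ in range(2000):
    for y in labels:
        out = sigmoid(w * x + b)
        grad = out - y                # cross-entropy gradient of a logistic unit
        w -= 0.01 * grad * x
        b -= 0.01 * grad

posterior = sigmoid(w * x + b)        # close to 0.7
```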
  • Backpropagation through time: what it does and how to do it

    Publication Year: 1990, Page(s):1550 - 1560
    Cited by:  Papers (574)  |  Patents (7)
    PDF (936 KB)

    Basic backpropagation, which is a simple method now being widely used in areas like pattern recognition and fault diagnosis, is reviewed. The basic equations for backpropagation through time, and applications to areas like pattern recognition involving dynamic systems, systems identification, and control are discussed. Further extensions of this method, to deal with systems other than neural netwo...
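As a hedged illustration of the basic mechanism (a single linear recurrent unit of my choosing, not the paper's notation): the network is unrolled over the input sequence, and the gradient of the final-step error is accumulated backward through every time step at which the shared weights appear.

```python
# One linear recurrent unit, h_t = w * h_{t-1} + u * x_t, unrolled over the
# sequence; the squared error at the final step is backpropagated through
# every copy of the shared weights w and u.

def bptt_grad(w, u, xs, target):
    hs = [0.0]
    for x in xs:                        # forward pass: unroll in time
        hs.append(w * hs[-1] + u * x)
    dh = 2.0 * (hs[-1] - target)        # d(error^2)/dh_T
    dw = du = 0.0
    for t in range(len(xs), 0, -1):     # backward pass through time
        dw += dh * hs[t - 1]            # the same w acts at every step
        du += dh * xs[t - 1]
        dh *= w                         # propagate the sensitivity to h_{t-1}
    return dw, du
```

The two returned values agree with finite-difference estimates of the unrolled loss, which is a quick way to check any BPTT implementation.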
  • A performance comparison of trained multilayer perceptrons and trained classification trees

    Publication Year: 1990, Page(s):1614 - 1619
    Cited by:  Papers (66)
    PDF (604 KB)

    The important differences between multilayer perceptrons and classification trees are considered. A number of empirical tests on three real-world problems in power-system load forecasting, power-system security prediction, and speaker-independent vowel recognition are presented. The load-forecasting problem, which is partially a regression problem, uses past trends to predict the critical needs of...
  • Ground states of partially connected binary neural networks

    Publication Year: 1990, Page(s):1575 - 1578
    Cited by:  Papers (1)
    PDF (252 KB)

    Neural networks defined by outer products of vectors over {-1, 0, 1} are considered. Patterns over {-1, 0, 1} define by their outer products partially connected neural networks consisting of internally strongly connected externally weakly connected subnetworks. Subpatterns over {-1, 1} define subnetworks, and their combinations that agree in the common bits define permissible words. It is shown th...
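A minimal instance of the weight construction (the patterns are mine, not the paper's): outer products of patterns over {-1, 0, 1} with disjoint supports produce a block-structured weight matrix whose zero entries correspond to missing connections between subnetworks.

```python
# Two subpatterns over {-1, 1} on disjoint supports, embedded in {-1, 0, 1}:
# their outer products give Hebbian weights inside each subnetwork and zero
# cross-blocks, i.e. no connections between the subnetworks.

patterns = [[1, -1, 0, 0],            # subpattern on nodes 0-1
            [0, 0, 1, 1]]             # subpattern on nodes 2-3

n = len(patterns[0])
W = [[0 if i == j else sum(p[i] * p[j] for p in patterns)
      for j in range(n)] for i in range(n)]
```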
  • Neural computation of arithmetic functions

    Publication Year: 1990, Page(s):1669 - 1675
    Cited by:  Papers (39)  |  Patents (7)
    PDF (548 KB)

    A neuron is modeled as a linear threshold gate, and the network architecture considered is the layered feedforward network. It is shown how common arithmetic functions such as multiplication and sorting can be efficiently computed in a shallow neural network. Some known results are improved by showing that the product of two n-bit numbers and sorting of n n-bit numbers c...
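The neuron model here is the linear threshold gate. Two single-gate examples follow (my choices, far simpler than the paper's multiplication and sorting circuits): majority, and a comparator deciding whether one n-bit number exceeds another.

```python
# A linear threshold gate fires iff the weighted input sum reaches the
# threshold. The paper's constructions compose such gates into shallow
# feedforward layers.

def threshold_gate(weights, threshold, inputs):
    return 1 if sum(w * x for w, x in zip(weights, inputs)) >= threshold else 0

def majority(bits):
    """MAJ_n: 1 iff at least ceil(n/2) bits are 1 (a single gate)."""
    n = len(bits)
    return threshold_gate([1] * n, (n + 1) // 2, bits)

def greater_than(a_bits, b_bits):
    """1 iff binary number a (MSB first) exceeds b (a single gate)."""
    n = len(a_bits)
    weights = [2 ** (n - 1 - i) for i in range(n)]
    return threshold_gate(weights + [-w for w in weights], 1, a_bits + b_bits)
```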
  • Constructive approximations for neural networks by sigmoidal functions

    Publication Year: 1990, Page(s):1586 - 1589
    Cited by:  Papers (29)
    PDF (164 KB)

    A constructive algorithm for uniformly approximating real continuous mappings by linear combinations of bounded sigmoidal functions is given. G. Cybenko (1989) has demonstrated the existence of uniform approximations to any continuous f provided that σ is continuous; the proof is nonconstructive, relying on the Hahn-Banach theorem and the dual characterization of C(I...
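In the constructive spirit of the paper (the staircase construction below and its parameters are mine, not the authors' algorithm): a continuous f on [0, 1] is approximated by a sum of steep sigmoids, each carrying the increment of f across one subinterval.

```python
import math

# Constructive staircase approximation: partition [0, 1] into n subintervals
# and add, at each midpoint, a steep sigmoid scaled by the increment of f
# across that subinterval. The sum tracks f more closely as n and the
# steepness grow.

def sigmoid(z):
    z = max(-60.0, min(60.0, z))      # clamp to avoid math.exp overflow
    return 1.0 / (1.0 + math.exp(-z))

def sigmoidal_approx(f, n=200, steepness=2000.0):
    knots = [i / n for i in range(n + 1)]
    terms = [(f(knots[i + 1]) - f(knots[i]), (knots[i] + knots[i + 1]) / 2)
             for i in range(n)]
    def g(x):
        return f(0.0) + sum(c * sigmoid(steepness * (x - m)) for c, m in terms)
    return g

g = sigmoidal_approx(lambda x: math.sin(3.0 * x))
```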
  • Holographic implementation of a fully connected neural network

    Publication Year: 1990, Page(s):1637 - 1645
    Cited by:  Papers (19)
    PDF (712 KB)

    A holographic implementation of a fully connected neural network is presented. This model has a simple structure and is relatively easy to implement, and its operating principles and characteristics can be extended to other types of networks, since any architecture can be considered as a fully connected network with some of its connections missing. The basic principles of the fully connected netwo...
  • Entropy nets: from decision trees to neural networks

    Publication Year: 1990, Page(s):1605 - 1613
    Cited by:  Papers (78)  |  Patents (5)
    PDF (952 KB)

    How the mapping of decision trees into a multilayer neural network structure can be exploited for the systematic design of a class of layered neural networks, called entropy nets (which have far fewer connections), is shown. Several important issues such as the automatic tree generation, incorporation of the incremental learning, and the generalization of knowledge acquired during the tree design ...
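The tree-to-network mapping can be sketched on a toy two-test tree (my example, not the paper's construction in full): layer 1 evaluates the split tests, layer 2 ANDs the tests along each root-to-leaf path, and layer 3 ORs the leaves of each class.

```python
# Toy tree with two binary tests mapped to a three-layer threshold network:
# layer 1 computes the tests, layer 2 detects root-to-leaf paths (AND),
# layer 3 collects the leaves labelled class 1 (OR).

def step(z):
    return 1 if z >= 0 else 0

def tree_net(x):
    # layer 1: the internal-node tests of the tree
    t0 = step(x[0] - 0.5)                         # is x0 >= 0.5 ?
    t1 = step(x[1] - 0.5)                         # is x1 >= 0.5 ?
    # layer 2: one unit per leaf, ANDing its path conditions
    leaf_both = step(t0 + t1 - 2)                 # class-0 leaf
    leaf_neither = step((1 - t0) + (1 - t1) - 2)  # class-0 leaf
    leaf_only0 = step(t0 + (1 - t1) - 2)          # class-1 leaf
    leaf_only1 = step((1 - t0) + t1 - 2)          # class-1 leaf
    # layer 3: OR of the class-1 leaves
    return step(leaf_only0 + leaf_only1 - 1)
```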
  • A statistical approach to learning and generalization in layered neural networks

    Publication Year: 1990, Page(s):1568 - 1574
    Cited by:  Papers (114)  |  Patents (10)
    PDF (596 KB)

    A general statistical description of the problem of learning from examples is presented. Learning in layered networks is posed as a search in the network parameter space for a network that minimizes an additive error function over statistically independent examples. By imposing the equivalence of the minimum error and the maximum likelihood criteria for training the network, the Gibbs distribution...
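The equivalence invoked here is the standard one (written below in generic statistical-mechanics notation, which may differ from the authors' symbols): an additive error over independent examples, treated as a negative log-likelihood, induces a Gibbs distribution over the network parameters.

```latex
E(w) = \sum_{i=1}^{m} \epsilon(w; x_i)
\qquad\Longrightarrow\qquad
P(w) = \frac{e^{-\beta E(w)}}{Z(\beta)},
\qquad Z(\beta) = \int e^{-\beta E(w)}\, dw
```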
  • Neural network models of sensory integration for improved vowel recognition

    Publication Year: 1990, Page(s):1658 - 1668
    Cited by:  Papers (28)  |  Patents (5)
    PDF (1144 KB)

    It is demonstrated that multiple sources of speech information can be integrated at a subsymbolic level to improve vowel recognition. Feedforward and recurrent neural networks are trained to estimate the acoustic characteristics of a vocal tract from images of the speaker's mouth. These estimates are then combined with the noise-degraded acoustic information, effectively increasing the signal-to-n...
  • On the convergence properties of the Hopfield model

    Publication Year: 1990, Page(s):1579 - 1585
    Cited by:  Papers (61)
    PDF (540 KB)

    The main contribution of the present work is showing that the known convergence properties of the Hopfield model can be reduced to a very simple case, for which an elementary proof is provided. The convergence properties of the Hopfield model are dependent on the structure of the interconnections matrix W and the method by which the nodes are updated. Three cases are known: (1) convergenc...
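The serial-mode result can be checked on a toy instance (the weights and update order below are arbitrary choices of mine): with a symmetric zero-diagonal W and one-node-at-a-time threshold updates, the energy never increases, so the state settles into a fixed point.

```python
# Symmetric zero-diagonal weights, asynchronous threshold updates: the energy
# E(s) = -1/2 * sum_ij W_ij s_i s_j is non-increasing along the trajectory.

def energy(W, s):
    n = len(s)
    return -0.5 * sum(W[i][j] * s[i] * s[j]
                      for i in range(n) for j in range(n))

def async_update(W, s, i):
    field = sum(W[i][j] * s[j] for j in range(len(s)))
    s[i] = 1 if field >= 0 else -1

W = [[0, 2, -1],
     [2, 0, 1],
     [-1, 1, 0]]                     # symmetric, zero diagonal
s = [-1, 1, 1]
trace = [energy(W, s)]
for i in (0, 1, 2, 0, 1, 2):         # serial (one node at a time) updates
    async_update(W, s, i)
    trace.append(energy(W, s))
```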
  • Neuromorphic electronic systems

    Publication Year: 1990, Page(s):1629 - 1636
    Cited by:  Papers (337)  |  Patents (1)
    PDF (756 KB)

    It is shown that for many problems, particularly those in which the input data are ill-conditioned and the computation can be specified in a relative manner, biological solutions are many orders of magnitude more effective than those using digital methods. This advantage can be attributed principally to the use of elementary physical phenomena as computational primitives, and to the representation...
  • On the decision regions of multilayer perceptrons

    Publication Year: 1990, Page(s):1590 - 1594
    Cited by:  Papers (42)
    PDF (376 KB)

    The capabilities of two-layer perceptrons are examined with respect to the geometric properties of the decision regions they are able to form. It is known that two-layer perceptrons can form decision regions which are nonconvex and even disconnected, though the extent of their capabilities in comparison to three-layer structures is not well understood. By relating the geometry of arrangements of h...
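A concrete two-layer example of the disconnected-region capability (a textbook-style construction of mine, not taken from the paper): four hidden threshold units on one input carve out the disconnected acceptance region [0, 1) ∪ [2, 3).

```python
# Two-layer threshold network on a single input whose acceptance region is
# the disconnected set [0, 1) U [2, 3).

def step(z):
    return 1 if z >= 0 else 0

def net(x):
    h = [step(x - t) for t in (0.0, 1.0, 2.0, 3.0)]   # hidden layer
    return step(h[0] - h[1] + h[2] - h[3] - 1)        # output unit
```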
  • Radar signal categorization using a neural network

    Publication Year: 1990, Page(s):1646 - 1657
    Cited by:  Papers (36)  |  Patents (4)
    PDF (948 KB)

    Neural networks are used to analyze a complex simulated radar environment which contains noisy radar pulses generated by many different emitters. The neural network used is an energy-minimizing network. The limiting process contains the state vector within a set of limits, and this model is called the brain state in a box, or BSB model, which forms energy minima (attractors in the network dynamica...
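The BSB dynamics are compact enough to sketch (the 2-node weights, gain, and input below are toy choices, not the radar model): the state is repeatedly fed back through W, amplified, and clipped to the box [-1, 1]^n, so corners of the box become the attractors used as categories.

```python
# Brain-state-in-a-box dynamics on two nodes: feed the state back through W,
# amplify, and clip each component to [-1, 1]; a weak initial input is
# driven into a corner of the box.

def bsb_step(W, x, alpha=0.3):
    n = len(x)
    y = [x[i] + alpha * sum(W[i][j] * x[j] for j in range(n)) for i in range(n)]
    return [max(-1.0, min(1.0, v)) for v in y]

W = [[1.0, 0.5],
     [0.5, 1.0]]                     # symmetric positive feedback
x = [0.2, 0.1]                       # weak, noisy initial evidence
for _ in range(20):
    x = bsb_step(W, x)
```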
  • Nearest neighbor pattern classification perceptrons

    Publication Year: 1990, Page(s):1595 - 1598
    Cited by:  Papers (35)  |  Patents (1)
    PDF (280 KB)

    A three-layer perceptron that uses the nearest-neighbor pattern classification rule is presented. This neural network is of interest because it is designed specifically for the set of training patterns, and incorporating the training of the network into the design eliminates the need for training algorithms. The technique therefore provides an alternative to the limitations and unpre...
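A simplified version of the design idea (the stored patterns and labels are mine): the first layer correlates the input with every training pattern, a winner-take-all stage selects the maximum correlation (equivalently the minimum Hamming distance for ±1 patterns), and the stored label is emitted; no iterative training occurs.

```python
# First layer: correlation of the input with each stored +/-1 pattern;
# second stage: winner-take-all; output: the stored label. The network is
# built directly from the training set, so no training algorithm runs.

patterns = [([1, 1, -1, -1], 0),
            ([-1, -1, 1, 1], 1),
            ([1, -1, 1, -1], 0)]

def classify(x):
    scores = [sum(w * xi for w, xi in zip(p, x)) for p, _ in patterns]
    best = scores.index(max(scores))         # winner-take-all
    return patterns[best][1]
```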
  • Convergence properties and stationary points of a perceptron learning algorithm

    Publication Year: 1990, Page(s):1599 - 1604
    Cited by:  Papers (28)  |  Patents (1)
    PDF (408 KB)

    An analysis of the stationary (convergence) points of an adaptive algorithm that adjusts the perceptron weights is presented. This algorithm is identical in form to the least-mean-square (LMS) algorithm, except that a hard limiter is incorporated at the output of the summer. The algorithm is described in detail, a simple two-input example is presented, and some of its convergence properties are il...
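The algorithm is easy to state concretely (the training set and rate below are mine): it is the LMS update, except that the error is formed after a hard limiter at the summer output, which makes it a fixed-increment perceptron rule on ±1 targets.

```python
# LMS-form update with a hard limiter (sign) at the summer output: the error
# d - sgn(w . x) replaces the linear LMS error. Toy data: the AND function
# with +/-1 targets, which is linearly separable, so the updates converge.

def sgn(z):
    return 1.0 if z >= 0 else -1.0

def train(samples, lr=0.1, epochs=50):
    w = [0.0, 0.0, 0.0]                    # two input weights + bias weight
    for _ in range(epochs):
        for x, d in samples:
            xb = list(x) + [1.0]           # append the bias input
            err = d - sgn(sum(wi * xi for wi, xi in zip(w, xb)))
            w = [wi + lr * err * xi for wi, xi in zip(w, xb)]
    return w

samples = [((0.0, 0.0), -1.0), ((0.0, 1.0), -1.0),
           ((1.0, 0.0), -1.0), ((1.0, 1.0), 1.0)]
w = train(samples)
```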

Aims & Scope

The most highly cited general-interest journal in electrical engineering and computer science, the Proceedings is the best way to stay informed on an exemplary range of topics.

Meet Our Editors

Editor-in-Chief
H. Joel Trussell
North Carolina State University