IEEE Transactions on Neural Networks

Issue 2 • June 1990

Displaying Results 1 - 11 of 11
  • Graph partitioning using annealed neural networks

    Publication Year: 1990, Page(s): 192 - 203
    Cited by: Papers (66) | Patents (4)

    A new algorithm, mean field annealing (MFA), is applied to the graph-partitioning problem. The MFA algorithm combines characteristics of the simulated-annealing algorithm and the Hopfield neural network. MFA exhibits the rapid convergence of the neural network while preserving the solution quality afforded by simulated annealing (SA). The rate of convergence of MFA on graph bipartitioning problems...
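The mean-field annealing idea described in the abstract can be sketched in a few lines. The following is a minimal illustrative implementation, assuming unit edge weights, a soft balance penalty, and a geometric cooling schedule; these are generic choices, not the paper's exact formulation:

```python
import math
import random

def mfa_bipartition(n, edges, t_start=2.0, t_factor=0.9, sweeps=50, balance=1.0):
    """Soft-spin mean-field annealing for graph bipartitioning.

    Each vertex i carries a spin s[i] in (-1, 1); at the end, its sign
    gives the partition side. The balance term is a soft penalty that
    keeps the two sides roughly equal in size.
    """
    w = [[0.0] * n for _ in range(n)]
    for i, j in edges:
        w[i][j] = w[j][i] = 1.0                 # unit edge weights (assumption)
    s = [random.uniform(-0.1, 0.1) for _ in range(n)]   # soft spins near zero
    t = t_start
    for _ in range(sweeps):
        for i in range(n):
            # Mean field on vertex i: neighbors pull it toward their side;
            # the balance penalty pushes against the currently larger side.
            field = sum(w[i][j] * s[j] for j in range(n)) - balance * (sum(s) - s[i])
            s[i] = math.tanh(field / t)
        t *= t_factor                           # anneal: lower the temperature
    part = [1 if x >= 0 else -1 for x in s]
    cut = sum(1 for i, j in edges if part[i] != part[j])
    return part, cut
```

The tanh update is the deterministic mean-field replacement for the stochastic spin flips of simulated annealing, which is where the speedup claimed in the abstract comes from.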
  • A novel objective function for improved phoneme recognition using time-delay neural networks

    Publication Year: 1990, Page(s): 216 - 228
    Cited by: Papers (50)

    Single-speaker and multispeaker recognition results are presented for the voiced stop consonants /b,d,g/ using time-delay neural networks (TDNNs) with a number of enhancements, including a new objective function for training these networks. The new objective function, called the classification figure of merit (CFM), differs markedly from the traditional mean-squared-error (MSE) objective function a...
  • A theoretical investigation into the performance of the Hopfield model

    Publication Year: 1990, Page(s): 204 - 215
    Cited by: Papers (127)

    An analysis is made of the behavior of the Hopfield model as a content-addressable memory (CAM) and as a method of solving the traveling salesman problem (TSP). The analysis is based on the geometry of the subspace set up by the degenerate eigenvalues of the connection matrix. The dynamic equation is shown to be equivalent to a projection of the input vector onto this subspace. In the case of cont...
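For readers unfamiliar with the model being analyzed, a minimal Hopfield content-addressable memory fits in a few lines. This is the generic textbook construction (Hebbian storage, synchronous sign-threshold recall), not the eigen-subspace analysis of the paper:

```python
import numpy as np

def store(patterns):
    """Hebbian connection matrix for a list of +/-1 pattern vectors."""
    x = np.array(patterns, dtype=float)
    n = x.shape[1]
    w = x.T @ x / n
    np.fill_diagonal(w, 0.0)               # no self-connections
    return w

def recall(w, probe, iters=10):
    """Iterate the threshold dynamics until a fixed point (or iters)."""
    s = np.array(probe, dtype=float)
    for _ in range(iters):
        nxt = np.sign(w @ s)
        nxt[nxt == 0] = 1.0                # break ties toward +1
        if np.array_equal(nxt, s):         # fixed point reached
            break
        s = nxt
    return s
```

Presenting a stored pattern with a few flipped bits and iterating the dynamics recovers the original pattern, which is the CAM behavior the abstract refers to.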
  • Trellis codes, receptive fields, and fault tolerant, self-repairing neural networks

    Publication Year: 1990, Page(s): 154 - 166
    Cited by: Papers (13) | Patents (1)

    Relationships between locally interconnected neural networks that use receptive field representations and trellis or convolutional codes are explored. A fault tolerant neural network is described. It is patterned after the trellis graph description of convolutional codes and is able to tolerate errors in its inputs and failures of constituent neurons. This network incorporates learning, which adds...
  • Standardization of neural network terminology

    Publication Year: 1990, Page(s): 244 - 245
    Cited by: Papers (2)

    Outlined are the initial activities of an ad hoc standards committee established by the IEEE Neural Networks Council to pursue the standardization of neural network terminology. A proposed list of frequently used terms to be considered by the committee is presented, along with several proposed definitions.
  • Perceptron-based learning algorithms

    Publication Year: 1990, Page(s): 179 - 191
    Cited by: Papers (75) | Patents (1)

    A key task for connectionist research is the development and analysis of learning algorithms. An examination is made of several supervised learning algorithms for single-cell and network models. The heart of these algorithms is the pocket algorithm, a modification of perceptron learning that makes perceptron learning well-behaved with nonseparable training data, even if the data are noisy and cont...
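The pocket algorithm mentioned in the abstract is simple enough to sketch directly: run ordinary perceptron learning on randomly drawn examples, but keep in a "pocket" the weight vector that has survived the longest run of consecutive correct classifications. The sampling scheme and stopping rule below are generic assumptions (the ratchet variant and other refinements are omitted):

```python
import random

def pocket_train(samples, n_features, iters=2000, seed=0):
    """samples: list of (x, y) with x a feature tuple and y in {-1, +1}.
    Returns the pocketed weight vector; the bias is the last component."""
    rng = random.Random(seed)
    w = [0.0] * (n_features + 1)           # current perceptron weights
    pocket, run, best_run = list(w), 0, -1
    for _ in range(iters):
        x, y = rng.choice(samples)         # draw a random training example
        xa = list(x) + [1.0]               # append constant bias input
        pred = 1 if sum(wi * xi for wi, xi in zip(w, xa)) > 0 else -1
        if pred == y:
            run += 1
            if run > best_run:             # new longest correct run:
                pocket, best_run = list(w), run   # update the pocket
        else:
            for k in range(len(w)):        # standard perceptron update
                w[k] += y * xa[k]
            run = 0
    return pocket
```

On separable data this reduces to ordinary perceptron learning; on nonseparable data the pocket retains the best weights seen so far instead of oscillating, which is the well-behavedness the abstract describes.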
  • A simple procedure for pruning back-propagation trained neural networks

    Publication Year: 1990, Page(s): 239 - 242
    Cited by: Papers (138) | Patents (10)

    The sensitivity of the global error (cost) function to the inclusion/exclusion of each synapse in the artificial neural network is estimated. Introduced are shadow arrays which keep track of the incremental changes to the synaptic weights during a single pass of back-propagating learning. The synapses are then ordered by decreasing sensitivity numbers so that the network can be efficiently pruned ...
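The shadow-array bookkeeping described in the abstract can be illustrated on a toy model: alongside each gradient-descent weight update, accumulate a per-weight sensitivity estimate, then rank weights for pruning. The estimator below (sum of squared weight increments) and the linear model are deliberate simplifications, not the paper's exact sensitivity formula:

```python
def train_with_shadow(data, n_features, lr=0.05, epochs=200):
    """Gradient descent on a linear model with a shadow array that
    accumulates each weight's squared update increments."""
    w = [0.0] * n_features
    shadow = [0.0] * n_features            # accumulated sensitivity per weight
    for _ in range(epochs):
        for x, y in data:
            pred = sum(wi * xi for wi, xi in zip(w, x))
            err = pred - y
            for k in range(n_features):
                delta = -lr * err * x[k]   # gradient step for weight k
                # Shadow array: a weight that rarely moves contributed
                # little to reducing the error and is a pruning candidate.
                shadow[k] += delta * delta
                w[k] += delta
    order = sorted(range(n_features), key=lambda k: shadow[k])  # least sensitive first
    return w, shadow, order
```

Pruning then proceeds from the front of `order`, removing the least-sensitive weights first and retraining as needed.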
  • Derivation of a class of training algorithms

    Publication Year: 1990, Page(s): 229 - 232
    Cited by: Papers (22)

    A novel derivation is presented of T. Kohonen's topographic mapping training algorithm (Self-Organization and Associative Memory, 1984), based upon an extension of the Linde-Buzo-Gray (LBG) algorithm for vector quantizer design. Thus a vector quantizer is designed by minimizing an L2 reconstruction distortion measure, including an additional contribution from the effect of code...
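The connection the abstract draws between vector quantization and topographic mapping is visible in the update rule itself: nearest-codeword assignment as in LBG, plus a neighborhood term that drags adjacent codewords along, which is what produces topographic ordering. The one-dimensional sketch below uses generic decay schedules and parameters that are assumptions, not taken from the paper:

```python
import math
import random

def som_1d(samples, n_units=8, epochs=100, lr0=0.5, radius0=3.0, seed=0):
    """Train a 1-D self-organizing map of scalar codewords on scalar data."""
    rng = random.Random(seed)
    units = [rng.random() for _ in range(n_units)]   # random initial codewords
    for e in range(epochs):
        frac = e / epochs
        lr = lr0 * (1.0 - frac)                      # decay the learning rate
        radius = max(radius0 * (1.0 - frac), 0.5)    # shrink the neighborhood
        for x in samples:
            # LBG-style step: find the best-matching (nearest) codeword.
            bmu = min(range(n_units), key=lambda i: abs(units[i] - x))
            for i in range(n_units):
                d = abs(i - bmu)           # distance along the map, not in data space
                h = math.exp(-(d * d) / (2 * radius * radius))
                units[i] += lr * h * (x - units[i])  # neighborhood-weighted pull
    return units
```

With the neighborhood radius forced to zero this collapses to plain competitive vector quantization; the extra neighborhood contribution is the term the abstract's derivation adds to the L2 distortion measure.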
  • Parallel, self-organizing, hierarchical neural networks

    Publication Year: 1990, Page(s): 167 - 178
    Cited by: Papers (35) | Patents (10)

    A new neural-network architecture called the parallel, self-organizing, hierarchical neural network (PSHNN) is presented. The new architecture involves a number of stages in which each stage can be a particular neural network (SNN). At the end of each stage, error detection is carried out, and a number of input vectors are rejected. Between two stages there is a nonlinear transformation of input v...
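The stage-and-reject control flow the abstract describes can be sketched independently of any particular network. Here the stages are hypothetical callables returning a label and a confidence, and the rejection rule is a simple threshold; both are illustrative assumptions, not the paper's error-detection scheme:

```python
def pshnn_classify(x, stages, transform, threshold=0.75):
    """stages: list of callables x -> (label, confidence).
    Inputs rejected by one stage are nonlinearly transformed and
    passed to the next; the last stage decides regardless."""
    for i, stage in enumerate(stages):
        label, confidence = stage(x)
        if confidence >= threshold:
            return label, i                # accepted at stage i
        x = transform(x)                   # rejected: transform, try next stage
    return label, len(stages) - 1
```

Because each stage only has to handle the inputs the earlier stages rejected, the stages can be trained in parallel on progressively harder subsets, which is the parallelism the architecture's name refers to.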
  • Learning of stable states in stochastic asymmetric networks

    Publication Year: 1990, Page(s): 233 - 238
    Cited by: Papers (4)

    Boltzmann-based models with asymmetric connections are investigated. Although they are initially unstable, these networks spontaneously self-stabilize as a result of learning. Moreover, pairs of weights symmetrize during learning; however, the symmetry is not enough to account for the observed stability. To characterize the system it is useful to consider how its entropy is affected by learning an...
  • Neural networks for control systems

    Publication Year: 1990, Page(s): 242 - 244
    Cited by: Papers (55)

    A description is given of 11 papers from the April 1990 special issue on neural networks in control systems of IEEE Control Systems Magazine. The emphasis was on presenting as varied and current a picture as possible of the use of neural networks in control. The papers described cover: the design of associative memories using feedback neural networks; a method to use neural networks to control hig...

Aims & Scope

IEEE Transactions on Neural Networks is devoted to the science and technology of neural networks. It publishes papers that disclose significant technical knowledge, exploratory developments, and applications of neural networks, from biology to software to hardware.

 

This Transactions ceased production in 2011. The current retitled publication is IEEE Transactions on Neural Networks and Learning Systems.
