IEEE Transactions on Neural Networks

Volume 7 • Issue 6 • November 1996

Displaying Results 1 - 25 of 32
  • Corrections to "Image Recovery and Segmentation Using Competitive Learning in a Layered Network"

    Publication Year: 1996

  • Self-Organizing Maps [Book Reviews]

    Publication Year: 1996
    Freely Available from IEEE
  • Neural Adaptive Control Technology [Books in Brief]

    Publication Year: 1996
    Freely Available from IEEE
  • Mathematical Perspectives on Neural Networks [Books in Brief]

    Publication Year: 1996
    Freely Available from IEEE
  • 1996 Index IEEE Transactions on Neural Networks Vol. 7

    Publication Year: 1996
    Freely Available from IEEE
  • Complex-valued multistate neural associative memory

    Publication Year: 1996, Page(s):1491 - 1496
    Cited by:  Papers (181)

    A model of a multivalued associative memory is presented. This memory has the form of a fully connected attractor neural network composed of multistate complex-valued neurons. Such a network is able to perform the task of storing and recalling gray-scale images. It is also shown that the complex-valued fully connected neural network may be considered as a generalization of a Hopfield network conta...
    (An illustrative sketch of a multistate complex-valued neuron follows this listing.)

  • Computational capabilities of local-feedback recurrent networks acting as finite-state machines

    Publication Year: 1996, Page(s):1521 - 1525
    Cited by:  Papers (17)

    In this paper we explore the expressive power of recurrent networks with local feedback connections for symbolic data streams. We rely on the analysis of the maximal set of strings that can be shattered by the concept class associated with these networks (i.e., strings that can be arbitrarily classified as positive or negative), and find that their expressive power is inherently limited, since there ...

  • A new neural network for solving linear and quadratic programming problems

    Publication Year: 1996, Page(s):1544 - 1548
    Cited by:  Papers (147)

    A new neural network for solving linear and quadratic programming problems is presented and is shown to be globally convergent. The new neural network improves existing neural networks for solving these problems: it avoids the parameter tuning problem, it is capable of achieving the exact solutions, and it uses only simple hardware in which no analog multipliers for variables are required. Furthe...
    (A generic projection-iteration sketch for a box-constrained quadratic program follows this listing.)

  • A generalized learning paradigm exploiting the structure of feedforward neural networks

    Publication Year: 1996, Page(s):1450 - 1460
    Cited by:  Papers (54)  |  Patents (1)

    In this paper a general class of fast learning algorithms for feedforward neural networks is introduced and described. The approach exploits the separability of each layer into linear and nonlinear blocks and consists of two steps. The first step is the descent of the error functional in the space of the outputs of the linear blocks (descent in the neuron space), which can be performed using any p...

  • A neural network-based model for paper currency recognition and verification

    Publication Year: 1996, Page(s):1482 - 1490
    Cited by:  Papers (51)  |  Patents (7)

    This paper describes the neural-based recognition and verification techniques used in a banknote machine, recently implemented for accepting paper currency of different countries. The perception mechanism is based on low-cost optoelectronic devices which produce a signal associated with the light refracted by the banknotes. The classification and verification steps are carried out by a society of ...

  • Learning capacity and sample complexity on expert networks

    Publication Year: 1996, Page(s):1517 - 1520
    Cited by:  Papers (11)

    A major development in knowledge-based neural networks is the integration of symbolic expert rule-based knowledge into neural networks, resulting in so-called rule-based neural (or connectionist) networks. An expert network here refers to a particular construct in which the uncertainty management model of symbolic expert systems is mapped into the activation function of the neural network. This pa...

  • An argument for abandoning the travelling salesman problem as a neural-network benchmark

    Publication Year: 1996, Page(s):1542 - 1544
    Cited by:  Papers (19)

    In this paper, a distinction is drawn between research which assesses the suitability of the Hopfield network for solving the travelling salesman problem (TSP) and research which attempts to determine the effectiveness of the Hopfield network as an optimization technique. It is argued that the TSP is generally misused as a benchmark for the latter goal, with the existence of an alternative linear ...

  • Adaptive associative memories capable of pattern segmentation

    Publication Year: 1996, Page(s):1439 - 1449
    Cited by:  Papers (7)

    This paper presents an adaptive type of associative memory (AAM) that can separate patterns from composite inputs which might be degraded by deficiency or noise and that can recover incomplete or noisy single patterns. The behavior of AAM is analyzed in terms of stability, giving the stable solutions (results of recall), and the recall of spurious memories (the undesired solutions) is shown to be ...

  • Incremental learning of complex temporal patterns

    Publication Year: 1996, Page(s):1465 - 1481
    Cited by:  Papers (52)

    A neural model for temporal pattern generation is used and analyzed for training with multiple complex sequences in a sequential manner. The network exhibits some degree of interference when new sequences are acquired. It is proven that the model is capable of incrementally learning a finite number of complex sequences. The model is then evaluated with a large set of highly correlated sequences. W...

  • Improving convergence and solution quality of Hopfield-type neural networks with augmented Lagrange multipliers

    Publication Year: 1996, Page(s):1507 - 1516
    Cited by:  Papers (44)

    Hopfield-type networks convert a combinatorial optimization problem into a constrained real optimization problem and solve the latter using the penalty method. There is a dilemma with such networks: when tuned to produce good-quality solutions, they can fail to converge to valid solutions; and when tuned to converge, they tend to give low-quality solutions. This paper proposes a new method, called the augmented Lag...
    (A generic augmented Lagrangian sketch follows this listing.)

  • An asynchronous distributed architecture model for the Boltzmann machine control mechanism

    Publication Year: 1996, Page(s):1538 - 1541

    We present a study addressing a hardware implementation of the Boltzmann machine that relies on the concept of asynchronous digital systems. The constraint of concurrently switching only unconnected neurons is dynamically satisfied by using an asynchronous distributed control mechanism. The design of the control architecture is derived from a formal definition of the problem by means of the trace t...

  • An analysis of noise in recurrent neural networks: convergence and generalization

    Publication Year: 1996, Page(s):1424 - 1438
    Cited by:  Papers (40)

    This paper concerns the effect of noise on the performance of feedforward neural nets. We introduce and analyze various methods of injecting synaptic noise into dynamically driven recurrent nets during training. Theoretical results show that applying a controlled amount of noise during training may improve convergence and generalization performance. We analyze the effects of various noise parameters and pred...
    (A minimal sketch of weight-noise injection during a training step follows this listing.)

  • A Lagrangian relaxation network for graph matching

    Publication Year: 1996, Page(s):1365 - 1381
    Cited by:  Papers (24)  |  Patents (1)

    A Lagrangian relaxation network for graph matching is presented. The problem is formulated as follows: given graphs G and g, find a permutation matrix M that brings the two sets of vertices into correspondence. Permutation matrix constraints are formulated in the framework of deterministic annealing. Our approach is in the same spirit as a Lagrangian decomposition approach in that the row and colu...

  • Exploring and comparing the best “direct methods” for the efficient training of MLP-networks

    Publication Year: 1996, Page(s):1497 - 1502
    Cited by:  Papers (6)

    It is well known that the main difficulties of backpropagation-based algorithms are their susceptibility to local minima and their slow adaptation to the training patterns. In this paper, we present a class of algorithms which overcome these difficulties by utilizing some “direct” numerical methods for the computation of the weight matrices. In particular, we inv...
    (A simple least-squares example of a direct weight computation follows this listing.)

  • Genetic evolution of radial basis function coverage using orthogonal niches

    Publication Year: 1996, Page(s):1525 - 1528
    Cited by:  Papers (24)  |  Patents (6)

    A well-performing set of radial basis functions (RBFs) can emerge from genetic competition among individual RBFs. Genetic selection of the individual RBFs is based on credit sharing which localizes competition within orthogonal niches. These orthogonal niches are derived using singular value decomposition and are used to apportion credit for the overall performance of the RBF network among individ...

  • Multiplication-free radial basis function network

    Publication Year: 1996, Page(s):1461 - 1464
    Cited by:  Papers (7)

    For the purpose of adaptive function approximation, a new radial basis function network is proposed which is nonlinear in its parameters. The goal is to reduce significantly the computational effort for a serial processor, by avoiding multiplication in both the evaluation of the function model and the computation of the parameter adaptation. The approximation scheme makes use of a grid-based Gauss...

  • Oscillatory and chaotic dynamics in neural networks under varying operating conditions

    Publication Year: 1996, Page(s):1382 - 1388
    Cited by:  Papers (38)

    This paper studies the effects of a time-dependent operating environment on the dynamics of a neural network. In a previous paper, Wang et al. (1990) studied an exactly solvable model of a higher-order neural network. We identified a bifurcation parameter for the system, i.e., the rescaled noise level, which represents the combined effects of incomplete connectivity, interference among stored pat...

  • Error-minimizing dead zone for basis function networks

    Publication Year: 1996, Page(s):1503 - 1506
    Cited by:  Papers (17)

    The incorporation of dead zones in the error signal of basis function networks avoids overtraining and guarantees the convergence of the normalized least mean square (LMS) algorithm and related algorithms. A new, so-called error-minimizing dead zone is presented, providing the smallest a posteriori error among all convergence-assuring dead zones. A general convergence proof is...
    (A minimal normalized-LMS dead-zone sketch follows this listing.)

  • Effect of probabilistic inputs on neural network-based electric load forecasting

    Publication Year: 1996, Page(s):1528 - 1532
    Cited by:  Papers (27)

    This paper presents a novel method to include the uncertainties of the weather-related input variables in neural network-based electric load forecasting models. The new method consists of traditionally trained neural networks and a set of equations to calculate the mean value and confidence intervals of the forecasted load. This method was tested for daily peak load forecasts for one year by using...
    (A Monte Carlo sketch for propagating input uncertainty through a trained model follows this listing.)

  • Neural network using longitudinal modes of an injection laser with external feedback

    Publication Year: 1996, Page(s):1389 - 1400
    Cited by:  Papers (6)

    A new optical neural-network concept using the control of the modes of an injection laser by external feedback is described by a simple laser model. This approach uses the wavelength dispersed longitudinal modes of the laser as neurons and the amount of external feedback as connection weights. The predictions of the simple model are confirmed both with extensive numerical examples using the laser ...

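Illustrative Sketches

The sketches below are rough, generic illustrations of techniques named in some of the abstracts above; they are not drawn from the articles themselves and do not reproduce the authors' exact formulations.

For "Complex-valued multistate neural associative memory": a minimal multistate complex-valued Hopfield-style memory. The resolution K, the generalized Hebbian weight rule, and the synchronous update schedule are assumed choices for illustration.

    import numpy as np

    def encode(levels, K):
        # Map integer gray levels in {0, ..., K-1} to K-th roots of unity.
        return np.exp(2j * np.pi * np.asarray(levels) / K)

    def csign(z, K):
        # Multistate activation: snap each complex value to the nearest of
        # the K phase states on the unit circle.
        k = np.round(np.angle(z) * K / (2 * np.pi)) % K
        return np.exp(2j * np.pi * k / K)

    def hebbian_weights(patterns):
        # Generalized Hebb rule: W = (1/N) * sum_p s_p s_p^H, zero diagonal.
        N = patterns.shape[1]
        W = sum(np.outer(p, np.conj(p)) for p in patterns) / N
        np.fill_diagonal(W, 0.0)
        return W

    def recall(W, probe, K, steps=20):
        # Synchronous recall: iterate the multistate activation on W @ state.
        state = probe.copy()
        for _ in range(steps):
            state = csign(W @ state, K)
        return state

A stored gray-scale image is first mapped with encode to a vector of K-th roots of unity; recall then snaps a noisy probe back toward a stored pattern.
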
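For "A new neural network for solving linear and quadratic programming problems": the abstract does not specify the architecture, so the sketch only shows a standard discrete-time projection iteration for a box-constrained quadratic program, whose fixed point satisfies the problem's KKT conditions. The box bounds, step size, and iteration count are assumptions.

    import numpy as np

    def projected_gradient_qp(Q, c, lower, upper, alpha=None, iters=5000):
        # Minimize 0.5*x'Qx + c'x over the box [lower, upper] by iterating
        #   x <- clip(x - alpha * (Q x + c), lower, upper).
        # A fixed point of this map satisfies the KKT conditions of the QP.
        Q = np.asarray(Q, dtype=float)
        c = np.asarray(c, dtype=float)
        if alpha is None:
            alpha = 1.0 / max(np.linalg.norm(Q, 2), 1e-12)  # step near 1/L
        x = np.clip(np.zeros_like(c), lower, upper)
        for _ in range(iters):
            x = np.clip(x - alpha * (Q @ x + c), lower, upper)
        return x
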
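For "Improving convergence and solution quality of Hopfield-type neural networks with augmented Lagrange multipliers": a minimal sketch of the classical augmented Lagrangian scheme that the title alludes to, applied to min f(x) subject to g(x) = 0. The step sizes, loop counts, and the plain-gradient inner solver are assumptions; the paper's network dynamics are not reproduced.

    import numpy as np

    def augmented_lagrangian(f_grad, g, g_jac, x0, c=10.0, lr=1e-2,
                             outer=50, inner=200):
        # Classical augmented Lagrangian for min f(x) s.t. g(x) = 0:
        #   L_c(x, lam) = f(x) + lam . g(x) + (c/2) * ||g(x)||^2
        # Inner loop: gradient descent on x; outer loop: multiplier ascent.
        x = np.asarray(x0, dtype=float)
        lam = np.zeros_like(np.asarray(g(x), dtype=float))
        for _ in range(outer):
            for _ in range(inner):
                x = x - lr * (f_grad(x) + g_jac(x).T @ (lam + c * g(x)))
            lam = lam + c * g(x)          # ascent step on the multipliers
        return x, lam

Unlike a pure penalty term, the multiplier term lets feasible (valid) solutions be reached without pushing the penalty weight c to extreme values, which is the dilemma the abstract describes.
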
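For "An analysis of noise in recurrent neural networks: convergence and generalization": the paper analyzes several ways of injecting synaptic noise during training; the sketch shows only the simplest generic variant, evaluating the gradient at weights perturbed by zero-mean Gaussian noise. The noise level, learning rate, and grad_fn interface are assumptions.

    import numpy as np

    def noisy_weight_step(W, grad_fn, sigma=0.01, lr=0.1, rng=None):
        # One training step with additive synaptic (weight) noise: the loss
        # gradient is evaluated at a noise-perturbed copy of the weights and
        # the update is applied to the clean weights.
        rng = np.random.default_rng() if rng is None else rng
        W_noisy = W + sigma * rng.standard_normal(W.shape)
        return W - lr * grad_fn(W_noisy)
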
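For "Exploring and comparing the best “direct methods” for the efficient training of MLP-networks": the particular methods compared in the paper are not reproduced; the sketch only illustrates the general idea of computing a weight matrix directly, solving the output layer of a single-hidden-layer network by linear least squares once the hidden activations are fixed. The tanh hidden layer and all function and argument names are assumptions.

    import numpy as np

    def direct_output_weights(X, T, hidden_weights, hidden_bias):
        # With the hidden layer fixed, the output weight matrix can be
        # obtained by a direct linear least-squares solve instead of by
        # iterative backpropagation.
        H = np.tanh(X @ hidden_weights + hidden_bias)   # hidden activations
        H1 = np.hstack([H, np.ones((H.shape[0], 1))])   # append bias column
        W_out, *_ = np.linalg.lstsq(H1, T, rcond=None)
        return W_out
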
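For "Error-minimizing dead zone for basis function networks": a minimal normalized-LMS step with a plain dead zone of half-width delta. The step size and the shrink-by-delta update below are assumptions; the paper's error-minimizing choice of the dead zone is not reproduced.

    import numpy as np

    def nlms_dead_zone_step(w, phi, target, delta, mu=0.5, eps=1e-8):
        # Normalized LMS with a dead zone: when the output error already lies
        # inside the dead zone, the weights are left unchanged; otherwise the
        # effective error is reduced by the dead-zone width delta.
        e = target - w @ phi
        if abs(e) <= delta:
            return w
        step = mu * np.sign(e) * (abs(e) - delta) / (phi @ phi + eps)
        return w + step * phi
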
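For "Effect of probabilistic inputs on neural network-based electric load forecasting": the paper derives a set of equations for the mean and confidence intervals of the forecast; the sketch substitutes a Monte Carlo propagation of the input uncertainty through an already-trained model, which estimates the same quantities empirically. The model callable, the Gaussian input model, and the sample count are assumptions.

    import numpy as np

    def forecast_with_input_uncertainty(model, x_mean, x_std,
                                        n_samples=2000, ci=0.95, seed=0):
        # Sample the uncertain (weather-related) inputs, push each sample
        # through the trained network `model`, and report the mean forecast
        # and an empirical confidence interval.
        rng = np.random.default_rng(seed)
        xs = rng.normal(x_mean, x_std, size=(n_samples, len(x_mean)))
        loads = np.array([model(x) for x in xs])
        lo, hi = np.quantile(loads, [(1 - ci) / 2, (1 + ci) / 2])
        return loads.mean(), (lo, hi)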

Aims & Scope

IEEE Transactions on Neural Networks is devoted to the science and technology of neural networks; it publishes work that discloses significant technical knowledge, exploratory developments, and applications of neural networks, from biology to software to hardware.

 

This Transactions ceased production in 2011. The current retitled publication is IEEE Transactions on Neural Networks and Learning Systems.
