IEEE Transactions on Neural Networks

Volume 7 Issue 6 • Nov. 1996

Displaying Results 1 - 25 of 32
  • Corrections to "Image Recovery and Segmentation Using Competitive Learning in a Layered Network"

    Publication Year: 1996
  • Self-Organizing Maps [Book Reviews]

    Publication Year: 1996
  • Neural Adaptive Control Technology [Books in Brief]

    Publication Year: 1996
  • Mathematical Perspectives on Neural Networks [Books in Brief]

    Publication Year: 1996
  • 1996 Index IEEE Transactions on Neural Networks Vol. 7

    Publication Year: 1996
  • Computational capabilities of local-feedback recurrent networks acting as finite-state machines

    Publication Year: 1996, Page(s):1521 - 1525
    Cited by:  Papers (17)

    In this paper we explore the expressive power of recurrent networks with local feedback connections for symbolic data streams. We rely on the analysis of the maximal set of strings that can be shattered by the concept class associated with these networks (i.e., strings that can be arbitrarily classified as positive or negative), and find that their expressive power is inherently limited, since there ...

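    For context, the shattering notion invoked here is the standard VC-theory one (stated as background, not quoted from the paper): a set of strings S = {s_1, ..., s_m} is shattered by the concept class C realized by these networks if every dichotomy of S is achievable, i.e.

        \[
        \forall\, b \in \{0,1\}^{m} \;\; \exists\, f \in \mathcal{C} \;:\; f(s_i) = b_i \quad (i = 1, \dots, m).
        \]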
  • Learning capacity and sample complexity on expert networks

    Publication Year: 1996, Page(s):1517 - 1520
    Cited by:  Papers (11)

    A major development in knowledge-based neural networks is the integration of symbolic expert rule-based knowledge into neural networks, resulting in so-called rule-based neural (or connectionist) networks. An expert network here refers to a particular construct in which the uncertainty management model of symbolic expert systems is mapped into the activation function of the neural network. This pa...

  • Improving convergence and solution quality of Hopfield-type neural networks with augmented Lagrange multipliers

    Publication Year: 1996, Page(s):1507 - 1516
    Cited by:  Papers (44)

    Hopfield-type networks convert a combinatorial optimization problem into a constrained real optimization problem and solve the latter using the penalty method. There is a dilemma with such networks: when tuned to produce good-quality solutions, they can fail to converge to valid solutions; and when tuned to converge, they tend to give low-quality solutions. This paper proposes a new method, called the augmented Lag...

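    As background for the method named in the title, a minimal augmented-Lagrangian outer loop is sketched below (Python, illustrative only: the objective gradient grad_E, constraints c, Jacobian jac_c, and all step sizes are placeholders, not the paper's formulation):

        import numpy as np

        def augmented_lagrangian_min(grad_E, c, jac_c, x, mu=10.0,
                                     outer=20, inner=500, lr=1e-2):
            # Minimize E(x) subject to c(x) = 0 via the augmented Lagrangian
            #   L(x, lam) = E(x) + lam . c(x) + (mu/2) ||c(x)||^2,
            # alternating gradient descent on x with multiplier updates on lam.
            lam = np.zeros(len(c(x)))
            for _ in range(outer):
                for _ in range(inner):
                    g = grad_E(x) + jac_c(x).T @ (lam + mu * c(x))
                    x = x - lr * g
                lam = lam + mu * c(x)   # dual (multiplier) update
            return x, lam

        # Toy check: minimize x1^2 + x2^2 subject to x1 + x2 = 1 (optimum [0.5, 0.5]).
        grad_E = lambda x: 2 * x
        c      = lambda x: np.array([x[0] + x[1] - 1.0])
        jac_c  = lambda x: np.array([[1.0, 1.0]])
        x_star, _ = augmented_lagrangian_min(grad_E, c, jac_c, np.zeros(2))

    Compared with the pure penalty method mentioned in the abstract, the multiplier term absorbs the constraint pressure, so the penalty weight mu need not grow without bound.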
  • Neural network using longitudinal modes of an injection laser with external feedback

    Publication Year: 1996, Page(s):1389 - 1400
    Cited by:  Papers (6)

    A new optical neural-network concept using the control of the modes of an injection laser by external feedback is described by a simple laser model. This approach uses the wavelength dispersed longitudinal modes of the laser as neurons and the amount of external feedback as connection weights. The predictions of the simple model are confirmed both with extensive numerical examples using the laser ...

  • Ultimate performance of QEM classifiers

    Publication Year: 1996, Page(s):1535 - 1537

    Supervised learning of classifiers often resorts to minimizing a quadratic error, even though this criterion is more naturally matched to nonlinear regression problems. It is shown that the mapping built by quadratic error minimization (QEM) tends to output the Bayesian discriminating rules even with nonuniform losses, provided the desired responses are chosen accordingly. This property is...

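    The property referred to rests on a standard least-squares identity (stated here as background, not taken from the paper): with targets y in {0, 1} encoding membership in class omega_1, the minimizer of the expected quadratic error over all mappings is the conditional expectation, which under this coding is the posterior probability,

        \[
        f^{*}(x) \;=\; \arg\min_{f}\, \mathbb{E}\big[(f(x) - y)^{2}\big] \;=\; \mathbb{E}[\,y \mid x\,] \;=\; P(\omega_{1} \mid x),
        \]

    so thresholding f^{*} reproduces the Bayes rule; rescaling the target values shifts the effective threshold, which is the sense in which "the desired responses are chosen accordingly" for nonuniform losses.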
  • Adaptive associative memories capable of pattern segmentation

    Publication Year: 1996, Page(s):1439 - 1449
    Cited by:  Papers (7)

    This paper presents an adaptive type of associative memory (AAM) that can separate patterns from composite inputs which might be degraded by deficiency or noise and that can recover incomplete or noisy single patterns. The behavior of AAM is analyzed in terms of stability, giving the stable solutions (results of recall), and the recall of spurious memories (the undesired solutions) is shown to be ...

  • Error-minimizing dead zone for basis function networks

    Publication Year: 1996, Page(s):1503 - 1506
    Cited by:  Papers (17)

    The incorporation of dead zones in the error signal of basis function networks avoids overtraining and guarantees the convergence of the normalized least mean square (LMS) algorithm and related algorithms. A new, so-called error-minimizing dead zone is presented, providing the least a posteriori error out of the set of all convergence-assuring dead zones. A general convergence proof is...

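    A schematic of the idea in Python (illustrative only: a fixed dead-zone threshold is used here, whereas the paper derives the specific error-minimizing choice):

        import numpy as np

        def nlms_dead_zone_step(phi, d, w, mu=0.5, dead_zone=0.05, eps=1e-8):
            # One normalized-LMS update for a linear-in-the-parameters basis
            # function model y = w . phi.  The update is suppressed whenever
            # the error magnitude lies inside the dead zone, so adaptation is
            # not driven by noise or model mismatch alone.
            e = d - w @ phi                     # a priori output error
            if abs(e) <= dead_zone:
                return w                        # inside the dead zone: freeze
            return w + mu * e / (eps + phi @ phi) * phi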
  • An argument for abandoning the travelling salesman problem as a neural-network benchmark

    Publication Year: 1996, Page(s):1542 - 1544
    Cited by:  Papers (19)

    In this paper, a distinction is drawn between research which assesses the suitability of the Hopfield network for solving the travelling salesman problem (TSP) and research which attempts to determine the effectiveness of the Hopfield network as an optimization technique. It is argued that the TSP is generally misused as a benchmark for the latter goal, with the existence of an alternative linear ...

  • Oscillatory and chaotic dynamics in neural networks under varying operating conditions

    Publication Year: 1996, Page(s):1382 - 1388
    Cited by:  Papers (38)

    This paper studies the effects of a time-dependent operating environment on the dynamics of a neural network. In a previous paper, Wang et al. (1990) studied an exactly solvable model of a higher-order neural network. We identified a bifurcation parameter for the system, i.e., the rescaled noise level, which represents the combined effects of incomplete connectivity, interference among stored pat...

  • A note on stability of analog neural networks with time delays

    Publication Year: 1996, Page(s):1533 - 1535
    Cited by:  Papers (89)

    This note presents a generalized sufficient condition which guarantees stability of analog neural networks with time delays. The condition is derived using a Lyapunov functional, and the stability criterion is stated as: the equilibrium of analog neural networks with delays is globally asymptotically stable if the product of the norm of the connection matrix and the maximum neuronal gain is less than one...

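    Written symbolically (a paraphrase of the criterion quoted above, with T the connection matrix and sigma'_max the maximum neuronal gain):

        \[
        \|T\| \cdot \sigma'_{\max} < 1 \;\;\Longrightarrow\;\; \text{the equilibrium of the delayed network is globally asymptotically stable.}
        \]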
  • An analysis of noise in recurrent neural networks: convergence and generalization

    Publication Year: 1996, Page(s):1424 - 1438
    Cited by:  Papers (40)

    This paper concerns the effect of noise on the performance of feedforward neural nets. We introduce and analyze various methods of injecting synaptic noise into dynamically driven recurrent nets during training. Theoretical results show that applying a controlled amount of noise during training may improve convergence and generalization performance. We analyze the effects of various noise parameters and pred...

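    One common realization of the training-time noise injection described above (Python sketch with additive Gaussian weight noise redrawn at every step; the paper compares several injection schemes that are not reproduced here):

        import numpy as np

        rng = np.random.default_rng(0)

        def noisy_recurrent_step(W, x, sigma=0.01):
            # Single recurrent step with synaptic (weight) noise: a zero-mean
            # Gaussian perturbation is added to the weights for this step only,
            # so every training step sees a slightly different network.
            W_noisy = W + sigma * rng.standard_normal(W.shape)
            return np.tanh(W_noisy @ x)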
  • Median radial basis function neural network

    Publication Year: 1996, Page(s):1351 - 1364
    Cited by:  Papers (53)  |  Patents (2)

    A radial basis function (RBF) network is a two-layer neural network in which each hidden unit implements a kernel function. Each kernel is associated with an activation region of the input space, and its output is fed to an output unit. In order to find the parameters of a neural network which embeds this structure, we consider two different statistical approaches. The first approach ...

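    The structure described in the first two sentences, as a Python sketch (Gaussian kernels with a linear output unit; the centers, widths and output weights are placeholders, and the median-based estimation that gives the paper its title is not shown):

        import numpy as np

        def rbf_forward(x, centers, widths, out_weights):
            # Two-layer RBF network: each hidden unit is a Gaussian kernel tied
            # to its own region of input space; the output unit is a weighted sum.
            d2 = ((x - centers) ** 2).sum(axis=1)        # squared distances to centers
            hidden = np.exp(-d2 / (2.0 * widths ** 2))   # kernel activations
            return out_weights @ hidden                  # linear output layer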
  • Incremental learning of complex temporal patterns

    Publication Year: 1996, Page(s):1465 - 1481
    Cited by:  Papers (52)

    A neural model for temporal pattern generation is used and analyzed for training with multiple complex sequences in a sequential manner. The network exhibits some degree of interference when new sequences are acquired. It is proven that the model is capable of incrementally learning a finite number of complex sequences. The model is then evaluated with a large set of highly correlated sequences. W...

  • Exploring and comparing the best “direct methods” for the efficient training of MLP-networks

    Publication Year: 1996, Page(s):1497 - 1502
    Cited by:  Papers (6)

    It is well known that the main difficulties of algorithms based on backpropagation are susceptibility to local minima and slow adaptation to the training patterns. In this paper, we present a class of algorithms which overcome these difficulties by utilizing some “direct” numerical methods for the computation of the matrices of weights. In particular, we inv...

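    As an illustration of what a “direct” (non-iterative) weight computation can look like, one common instance is solving a layer's weights by linear least squares (Python sketch of the general idea, not the specific algorithms investigated in the paper):

        import numpy as np

        def direct_output_weights(H, T, ridge=1e-6):
            # Given hidden-layer activations H (n_samples x n_hidden) and targets
            # T (n_samples x n_outputs), obtain the output-layer weights in one
            # shot from the regularized normal equations instead of iterating
            # with backpropagation.
            A = H.T @ H + ridge * np.eye(H.shape[1])
            return np.linalg.solve(A, H.T @ T)           # n_hidden x n_outputs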
  • Genetic evolution of radial basis function coverage using orthogonal niches

    Publication Year: 1996, Page(s):1525 - 1528
    Cited by:  Papers (24)  |  Patents (9)

    A well-performing set of radial basis functions (RBFs) can emerge from genetic competition among individual RBFs. Genetic selection of the individual RBFs is based on credit sharing, which localizes competition within orthogonal niches. These orthogonal niches are derived using singular value decomposition and are used to apportion credit for the overall performance of the RBF network among individ...

  • On adaptive trajectory tracking of a robot manipulator using inversion of its neural emulator

    Publication Year: 1996, Page(s):1401 - 1414
    Cited by:  Papers (32)

    This paper is concerned with the design of a neuro-adaptive trajectory tracking controller. The paper presents a new control scheme based on inversion of a feedforward neural model of a robot arm. The proposed control scheme requires two modules. The first module consists of an appropriate feedforward neural model of the forward dynamics of the robot arm that continuously accounts for the changes in t...

  • An asynchronous distributed architecture model for the Boltzmann machine control mechanism

    Publication Year: 1996, Page(s):1538 - 1541

    We present a study addressing a hardware implementation of the Boltzmann machine that relies on the concept of an asynchronous digital system. The constraint of concurrently switching only unconnected neurons is dynamically satisfied by using an asynchronous distributed control mechanism. The design of the control architecture is derived from a formal definition of the problem by means of the trace t...

  • Learning long-term dependencies in NARX recurrent neural networks

    Publication Year: 1996, Page(s):1329 - 1338
    Cited by:  Papers (152)  |  Patents (1)

    It has previously been shown that gradient-descent learning algorithms for recurrent neural networks can perform poorly on tasks that involve long-term dependencies, i.e., those problems for which the desired output depends on inputs presented at times far in the past. We show that the long-term dependencies problem is lessened for a class of architectures called nonlinear autoregressive models with exogenous inputs (NARX) ...

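    The architecture class named in the title, written out (standard NARX form with generic input and output delay orders n_u and n_y):

        \[
        y(t) \;=\; f\big(u(t), u(t-1), \dots, u(t-n_u),\; y(t-1), \dots, y(t-n_y)\big),
        \]

    where f is realized by a feedforward network and the delayed outputs are fed back as inputs; a plausible intuition (not quoted from the abstract) is that the output delay taps provide shorter paths along which gradient information can travel back in time.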
  • A generalized learning paradigm exploiting the structure of feedforward neural networks

    Publication Year: 1996, Page(s):1450 - 1460
    Cited by:  Papers (54)  |  Patents (1)

    In this paper a general class of fast learning algorithms for feedforward neural networks is introduced and described. The approach exploits the separability of each layer into linear and nonlinear blocks and consists of two steps. The first step is the descent of the error functional in the space of the outputs of the linear blocks (descent in the neuron space), which can be performed using any p...

  • A new neural network for solving linear and quadratic programming problems

    Publication Year: 1996, Page(s):1544 - 1548
    Cited by:  Papers (147)

    A new neural network for solving linear and quadratic programming problems is presented and is shown to be globally convergent. The new neural network improves on existing neural networks for solving these problems: it avoids the parameter tuning problem, it is capable of achieving the exact solutions, and it uses only simple hardware in which no analog multipliers for variables are required. Furthe...

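    For reference, the problem class addressed is the standard convex quadratic program, of which linear programming is the special case Q = 0 (generic form, not necessarily the exact formulation used in the paper):

        \[
        \min_{x}\; \tfrac{1}{2}\, x^{\top} Q x + c^{\top} x
        \quad \text{subject to} \quad A x = b,\; x \ge 0 .
        \]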

Aims & Scope

IEEE Transactions on Neural Networks is devoted to the science and technology of neural networks; it publishes work that discloses significant technical knowledge, exploratory developments, and applications of neural networks, spanning biology, software, and hardware.

 

This Transactions ceased production in 2011. The current retitled publication is IEEE Transactions on Neural Networks and Learning Systems.
