IEEE Transactions on Neural Networks

Issue 6 • Date Nov. 1996

Displaying Results 1 - 25 of 32
  • Corrections to "Image Recovery and Segmentation Using Competitive Learning in a Layered Network"

    Publication Year: 1996
  • Self-Organizing Maps [Book Reviews]

    Publication Year: 1996
  • Neural Adaptive Control Technology [Books in Brief]

    Publication Year: 1996
  • Mathematical Perspectives on Neural Networks [Books in Brief]

    Publication Year: 1996
  • 1996 Index IEEE Transactions on Neural Networks Vol. 7

    Publication Year: 1996
  • Complex-valued multistate neural associative memory

    Publication Year: 1996, Page(s):1491 - 1496
    Cited by:  Papers (142)

    A model of a multivalued associative memory is presented. This memory has the form of a fully connected attractor neural network composed of multistate complex-valued neurons. Such a network is able to perform the task of storing and recalling gray-scale images. It is also shown that the complex-valued fully connected neural network may be considered as a generalization of a Hopfield network conta...
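    The multistate complex-valued neuron this abstract mentions is conventionally built by quantizing the phase of the neuron's net input onto K points of the unit circle. A minimal sketch of that activation (the function name, the choice K = 4, and the sector-midpoint convention are generic assumptions, not necessarily the paper's exact formulation):

```python
import numpy as np

def csignum(z, K):
    """Quantize the phase of complex z onto K equally spaced unit-circle
    states -- the standard multistate activation used in complex-valued
    associative memories."""
    k = np.floor(K * np.angle(z) / (2 * np.pi)) % K  # sector index 0..K-1
    return np.exp(2j * np.pi * (k + 0.5) / K)        # sector midpoint state

# Example: with K = 4 states, a net input with phase 0.1 rad falls in
# the first sector and maps to its midpoint exp(i*pi/4).
s = csignum(np.exp(0.1j), K=4)
```

    Stored gray levels can then be encoded as the K phase states, which is what lets such a network store and recall gray-scale images.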
  • Effect of probabilistic inputs on neural network-based electric load forecasting

    Publication Year: 1996, Page(s):1528 - 1532
    Cited by:  Papers (25)

    This paper presents a novel method to include the uncertainties of the weather-related input variables in neural network-based electric load forecasting models. The new method consists of traditionally trained neural networks and a set of equations to calculate the mean value and confidence intervals of the forecasted load. This method was tested for daily peak load forecasts for one year by using...
  • The self-organizing field [Kohonen maps]

    Publication Year: 1996, Page(s):1415 - 1423
    Cited by:  Papers (1)

    Many of the properties of the well-known Kohonen map algorithm are not easily derivable from its discrete formulation. For instance, the “projection” implemented by the map from a high dimensional input space to a lower dimensional map space must be properly regarded as a projection from a smooth manifold to a lattice and, in this framework, some of its properties are not easily identi...
  • Error-minimizing dead zone for basis function networks

    Publication Year: 1996, Page(s):1503 - 1506
    Cited by:  Papers (15)

    The incorporation of dead zones in the error signal of basis function networks avoids the networks' overtraining and guarantees the convergence of the normalized least mean square (LMS) algorithm and related algorithms. A new so-called error-minimizing dead zone is presented providing the least a posteriori error out of the set of all convergence assuring dead zones. A general convergence proof is...
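    A plain dead zone on the error signal, of the kind this abstract builds on, suppresses weight updates when the error is already small. A generic sketch (the paper's "error-minimizing" choice of the dead-zone width d is more specific than this textbook form):

```python
def dead_zone(e, d):
    """Standard dead zone on an error signal: updates are suppressed
    when |e| <= d, otherwise the error is shrunk toward zero by d."""
    if abs(e) <= d:
        return 0.0
    return e - d if e > 0 else e + d
```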
  • A neural network-based model for paper currency recognition and verification

    Publication Year: 1996, Page(s):1482 - 1490
    Cited by:  Papers (46)  |  Patents (7)

    This paper describes the neural-based recognition and verification techniques used in a banknote machine, recently implemented for accepting paper currency of different countries. The perception mechanism is based on low-cost optoelectronic devices which produce a signal associated with the light refracted by the banknotes. The classification and verification steps are carried out by a society of ...
  • An argument for abandoning the travelling salesman problem as a neural-network benchmark

    Publication Year: 1996, Page(s):1542 - 1544
    Cited by:  Papers (18)

    In this paper, a distinction is drawn between research which assesses the suitability of the Hopfield network for solving the travelling salesman problem (TSP) and research which attempts to determine the effectiveness of the Hopfield network as an optimization technique. It is argued that the TSP is generally misused as a benchmark for the latter goal, with the existence of an alternative linear ...
  • Multiplication-free radial basis function network

    Publication Year: 1996, Page(s):1461 - 1464
    Cited by:  Papers (7)

    For the purpose of adaptive function approximation, a new radial basis function network is proposed which is nonlinear in its parameters. The goal is to reduce significantly the computational effort for a serial processor, by avoiding multiplication in both the evaluation of the function model and the computation of the parameter adaptation. The approximation scheme makes use of a grid-based Gauss...
  • Median radial basis function neural network

    Publication Year: 1996, Page(s):1351 - 1364
    Cited by:  Papers (48)

    Radial basis functions (RBFs) consist of a two-layer neural network, where each hidden unit implements a kernel function. Each kernel is associated with an activation region from the input space and its output is fed to an output unit. In order to find the parameters of a neural network which embeds this structure we take into consideration two different statistical approaches. The first approach ...
  • Genetic evolution of radial basis function coverage using orthogonal niches

    Publication Year: 1996, Page(s):1525 - 1528
    Cited by:  Papers (24)  |  Patents (6)

    A well-performing set of radial basis functions (RBFs) can emerge from genetic competition among individual RBFs. Genetic selection of the individual RBFs is based on credit sharing which localizes competition within orthogonal niches. These orthogonal niches are derived using singular value decomposition and are used to apportion credit for the overall performance of the RBF network among individ...
  • On adaptive trajectory tracking of a robot manipulator using inversion of its neural emulator

    Publication Year: 1996, Page(s):1401 - 1414
    Cited by:  Papers (30)

    This paper is concerned with the design of a neuro-adaptive trajectory tracking controller. The paper presents a new control scheme based on inversion of a feedforward neural model of a robot arm. The proposed control scheme requires two modules. The first module consists of an appropriate feedforward neural model of forward dynamics of the robot arm that continuously accounts for the changes in t...
  • Exploring and comparing the best “direct methods” for the efficient training of MLP-networks

    Publication Year: 1996, Page(s):1497 - 1502
    Cited by:  Papers (6)

    It is well known that the main difficulties of the algorithms based on backpropagation are the susceptibility to local minima and the slow adaptivity to the patterns during the training. In this paper, we present a class of algorithms, which overcome the above difficulties by utilizing some “direct” numerical methods for the computation of the matrices of weights. In particular, we inv...
  • A note on stability of analog neural networks with time delays

    Publication Year: 1996, Page(s):1533 - 1535
    Cited by:  Papers (88)

    This note presents a generalized sufficient condition which guarantees stability of analog neural networks with time delays. The condition is derived using a Lyapunov functional and the stability criterion is stated as: the equilibrium of analog neural networks with delays is globally asymptotically stable if the product of the norm of connection matrix and the maximum neuronal gain is less than o...
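    The stated criterion is concrete enough to check numerically: stability is guaranteed when the product of the connection-matrix norm and the maximum neuronal gain is below one. A sketch with made-up illustrative values (the matrix W and the gain below are placeholders, not from the paper):

```python
import numpy as np

def delay_stability_check(W, max_gain):
    """Sufficient condition from the abstract: the delayed network's
    equilibrium is globally asymptotically stable if
    ||W|| * max_gain < 1, where W is the connection matrix and
    max_gain is the largest slope of the neuron activations."""
    return np.linalg.norm(W, 2) * max_gain < 1.0

# Hypothetical 3-neuron network (illustrative values only)
W = np.array([[ 0.0, 0.2, -0.1],
              [ 0.1, 0.0,  0.2],
              [-0.2, 0.1,  0.0]])
ok = delay_stability_check(W, max_gain=1.0)
```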
  • An analysis of noise in recurrent neural networks: convergence and generalization

    Publication Year: 1996, Page(s):1424 - 1438
    Cited by:  Papers (36)

    Concerns the effect of noise on the performance of feedforward neural nets. We introduce and analyze various methods of injecting synaptic noise into dynamically driven recurrent nets during training. Theoretical results show that applying a controlled amount of noise during training may improve convergence and generalization performance. We analyze the effects of various noise parameters and pred...
  • Improving convergence and solution quality of Hopfield-type neural networks with augmented Lagrange multipliers

    Publication Year: 1996, Page(s):1507 - 1516
    Cited by:  Papers (42)

    Hopfield-type networks convert a combinatorial optimization to a constrained real optimization and solve the latter using the penalty method. There is a dilemma with such networks: when tuned to produce good-quality solutions, they can fail to converge to valid solutions; and when tuned to converge, they tend to give low-quality solutions. This paper proposes a new method, called the augmented Lag...
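    The augmented-Lagrangian idea named in the title, in its generic textbook form, adds both a multiplier term and a quadratic penalty for each constraint. A sketch (the functions and the multiplier/penalty values below are illustrative placeholders, not the paper's network energy):

```python
def augmented_lagrangian(f, g, lam, c):
    """Generic augmented-Lagrangian objective: f(x) plus a multiplier
    term lam*g(x) plus a quadratic penalty (c/2)*g(x)**2 enforcing the
    constraint g(x) = 0."""
    def L(x):
        gx = g(x)
        return f(x) + lam * gx + 0.5 * c * gx * gx
    return L

# Illustrative: minimize x^2 subject to x = 1 (placeholder problem).
L = augmented_lagrangian(lambda x: x * x, lambda x: x - 1.0, lam=2.0, c=4.0)
```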
  • A new neural network for solving linear and quadratic programming problems

    Publication Year: 1996, Page(s):1544 - 1548
    Cited by:  Papers (134)

    A new neural network for solving linear and quadratic programming problems is presented and is shown to be globally convergent. The new neural network improves existing neural networks for solving these problems: it avoids the parameter tuning problem, it is capable of achieving the exact solutions, and it uses only simple hardware in which no analog multipliers for variables are required. Furthe...
  • Incremental learning of complex temporal patterns

    Publication Year: 1996, Page(s):1465 - 1481
    Cited by:  Papers (51)

    A neural model for temporal pattern generation is used and analyzed for training with multiple complex sequences in a sequential manner. The network exhibits some degree of interference when new sequences are acquired. It is proven that the model is capable of incrementally learning a finite number of complex sequences. The model is then evaluated with a large set of highly correlated sequences. W...
  • A Lagrangian relaxation network for graph matching

    Publication Year: 1996, Page(s):1365 - 1381
    Cited by:  Papers (24)  |  Patents (1)

    A Lagrangian relaxation network for graph matching is presented. The problem is formulated as follows: given graphs G and g, find a permutation matrix M that brings the two sets of vertices into correspondence. Permutation matrix constraints are formulated in the framework of deterministic annealing. Our approach is in the same spirit as a Lagrangian decomposition approach in that the row and colu...
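    Softly enforcing the row and column constraints of a permutation matrix is commonly done with alternating row/column normalization (Sinkhorn balancing) inside deterministic annealing. A generic sketch of that routine (the paper's Lagrangian treatment of the constraints differs in detail):

```python
import numpy as np

def sinkhorn(M, iters=50):
    """Alternately normalize rows and columns of a positive matrix so it
    approaches a doubly stochastic (permutation-like) matrix."""
    M = np.asarray(M, dtype=float).copy()
    for _ in range(iters):
        M /= M.sum(axis=1, keepdims=True)  # rows sum to 1
        M /= M.sum(axis=0, keepdims=True)  # columns sum to 1
    return M

# Illustrative 2x2 match matrix (placeholder values)
M = sinkhorn([[1.0, 2.0], [3.0, 4.0]])
```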
  • Ultimate performance of QEM classifiers

    Publication Year: 1996, Page(s):1535 - 1537

    Supervised learning of classifiers often resorts to the minimization of a quadratic error, even if this criterion is more especially matched to nonlinear regression problems. It is shown that the mapping built by a quadratic error minimization (QEM) tends to output the Bayesian discriminating rules even with nonuniform losses, provided the desired responses are chosen accordingly. This property is...
  • Adaptive associative memories capable of pattern segmentation

    Publication Year: 1996, Page(s):1439 - 1449
    Cited by:  Papers (7)

    This paper presents an adaptive type of associative memory (AAM) that can separate patterns from composite inputs which might be degraded by deficiency or noise and that can recover incomplete or noisy single patterns. The behavior of AAM is analyzed in terms of stability, giving the stable solutions (results of recall), and the recall of spurious memories (the undesired solutions) is shown to be ...
  • Learning long-term dependencies in NARX recurrent neural networks

    Publication Year: 1996, Page(s):1329 - 1338
    Cited by:  Papers (123)

    It has previously been shown that gradient-descent learning algorithms for recurrent neural networks can perform poorly on tasks that involve long-term dependencies, i.e. those problems for which the desired output depends on inputs presented at times far in the past. We show that the long-term dependencies problem is lessened for a class of architectures called nonlinear autoregressive models wit...
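    A NARX model of the kind this abstract describes computes its next output from tapped delay lines of past outputs and past inputs. A minimal sketch (the mapping f, the delay orders, and the random weights standing in for a trained MLP are all illustrative assumptions, not the paper's experimental setup):

```python
import numpy as np

def narx_step(f, y_hist, u_hist):
    """One step of a NARX model:
    y(t) = f(y(t-1), ..., y(t-ny), u(t-1), ..., u(t-nu))."""
    return f(np.concatenate([y_hist, u_hist]))

# A fixed random one-hidden-layer map stands in for a trained MLP
# (placeholder weights, not learned).
rng = np.random.default_rng(0)
W1, W2 = rng.normal(size=(8, 4)), rng.normal(size=8)

def f(z):
    return float(np.tanh(W2 @ np.tanh(W1 @ z)))

y_hist, u_hist = np.zeros(2), np.zeros(2)  # ny = nu = 2
for u in [1.0, 0.0, 0.0, 0.0]:             # impulse input sequence
    u_hist = np.array([u, u_hist[0]])      # shift in the new input
    y = narx_step(f, y_hist, u_hist)
    y_hist = np.array([y, y_hist[0]])      # feed the output back
```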

Aims & Scope

IEEE Transactions on Neural Networks is devoted to the science and technology of neural networks, publishing articles that disclose significant technical knowledge, exploratory developments, and applications of neural networks, from biology to software to hardware.

 

This Transactions ceased production in 2011. The current retitled publication is IEEE Transactions on Neural Networks and Learning Systems.
