
Neural Networks, IEEE Transactions on

Issue 6 • Nov. 2000


Displaying Results 1 - 25 of 34
  • Book reviews

    Publication Year: 2000 , Page(s): 1508 - 1511
    PDF (30 KB)
    Freely Available from IEEE
  • Author index

    Publication Year: 2000 , Page(s): 1512 - 1516
    PDF (52 KB)
    Freely Available from IEEE
  • Subject index

    Publication Year: 2000 , Page(s): 1516 - 1529
    PDF (103 KB)
    Freely Available from IEEE
  • On-line learning of dynamical systems in the presence of model mismatch and disturbances

    Publication Year: 2000 , Page(s): 1272 - 1283
    Cited by:  Papers (4)
    PDF (304 KB)

    This paper is concerned with the online learning of unknown dynamical systems using a recurrent neural network. The unknown dynamical systems to be learned are subject to disturbances and may be unstable. The neural-network model used has a simple architecture with one layer of adaptive connection weights. Four learning rules are proposed for the cases where the system state is measurable in conti...

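    As an illustration of the general setup (not the paper's four specific rules), the sketch below adapts a single layer of connection weights W online from the one-step prediction error of a simple state model; the surrogate plant, activation, and learning rate are hypothetical.

        import numpy as np

        rng = np.random.default_rng(0)
        n = 3                                    # state dimension
        A_model = 0.5 * np.eye(n)                # fixed linear part of the learner
        W = np.zeros((n, n))                     # single layer of adaptive connection weights
        eta = 0.05                               # learning rate

        def phi(x):
            return np.tanh(x)                    # nonlinear regressor vector

        x = rng.standard_normal(n)               # measured plant state
        for t in range(2000):
            # surrogate "unknown" plant with a small disturbance (data generation only)
            x_next = 0.8 * x + 0.3 * np.sin(x) + 0.01 * rng.standard_normal(n)
            x_pred = A_model @ x + W @ phi(x)    # one-step model prediction
            e = x_next - x_pred                  # prediction error
            W += eta * np.outer(e, phi(x))       # gradient-type online weight update
            x = x_next
        print(float(np.linalg.norm(e)))          # final one-step prediction error
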
  • Convergent on-line algorithms for supervised learning in neural networks

    Publication Year: 2000 , Page(s): 1284 - 1299
    Cited by:  Papers (12)
    PDF (372 KB)

    We define online algorithms for neural-network training based on the construction of multiple copies of the network, each trained on a different data block. It is shown that suitable training algorithms can be defined so that the disagreement between the different copies of the network is asymptotically reduced and convergence toward stationary points of the global error fun...

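    A minimal numpy sketch of the idea of training several copies of a model on separate data blocks while shrinking their disagreement; the linear model, step sizes, and consensus term are illustrative assumptions, not the paper's algorithms.

        import numpy as np

        rng = np.random.default_rng(1)
        d, K = 5, 4                                    # parameter dimension, number of copies
        w_true = rng.standard_normal(d)
        X = [rng.standard_normal((200, d)) for _ in range(K)]           # one data block per copy
        y = [Xk @ w_true + 0.1 * rng.standard_normal(200) for Xk in X]

        W = [np.zeros(d) for _ in range(K)]            # one parameter copy per block
        eta, gamma = 0.01, 0.1                         # step size, disagreement-reduction strength
        for t in range(2000):
            w_bar = np.mean(W, axis=0)                 # average of the copies
            for k in range(K):
                i = rng.integers(len(y[k]))
                g = (X[k][i] @ W[k] - y[k][i]) * X[k][i]         # stochastic gradient on block k
                W[k] = W[k] - eta * g - gamma * (W[k] - w_bar)   # pull the copy toward the average
        print(max(np.linalg.norm(Wk - w_true) for Wk in W))      # copies agree and approach w_true
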
  • State-based SHOSLIF for indoor visual navigation

    Publication Year: 2000 , Page(s): 1300 - 1314
    Cited by:  Papers (7)
    PDF (412 KB)

    In this paper, we investigate vision-based navigation using the self-organizing hierarchical optimal subspace learning and inference framework (SHOSLIF), which incorporates states and a visual attention mechanism. With states retaining history information and the incoming video input regarded as an observation vector, vision-based navigation is formulated as an observation-driven Markov model...

  • User adaptive handwriting recognition by self-growing probabilistic decision-based neural networks

    Publication Year: 2000 , Page(s): 1373 - 1384
    Cited by:  Papers (6)
    PDF (276 KB)

    Based on self-growing probabilistic decision-based neural networks (SPDNNs), user adaptation of the parameters of the SPDNN is formulated as incremental reinforced and anti-reinforced learning procedures, which are easily integrated into the batched training procedures of the SPDNN. In this study, we developed: 1) an SPDNN-based handwriting recognition system; 2) a two-stage recognition structure; and...

  • Lp approximation of Sigma-Pi neural networks

    Publication Year: 2000 , Page(s): 1485 - 1489
    Cited by:  Papers (3)
    PDF (184 KB)

    A feedforward Sigma-Pi neural network with a single hidden layer of m neurons is given by Σ_{j=1..m} c_j g(Π_{k=1..n} λ_k x_k^{θ_kj}), where c_j, θ_kj, λ_k ∈ R. We investigate the approximation of arbitrary functions f: R^n → R ...

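    For concreteness, the sketch below evaluates one common Sigma-Pi (product-unit) parameterization; the exact form and the role of the scaling parameters in the paper may differ.

        import numpy as np

        def sigma_pi(x, c, theta, g=np.tanh):
            """y(x) = sum_j c_j * g( prod_k x_k ** theta_kj )."""
            prods = np.prod(x[:, None] ** theta, axis=0)   # one product unit per hidden neuron
            return float(c @ g(prods))

        rng = np.random.default_rng(2)
        n, m = 3, 8                                        # inputs, hidden neurons
        x = rng.uniform(0.1, 1.0, n)                       # positive inputs keep real exponents well defined
        print(sigma_pi(x, rng.standard_normal(m), rng.standard_normal((n, m))))
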
  • A hybrid linear-neural model for time series forecasting

    Publication Year: 2000 , Page(s): 1402 - 1412
    Cited by:  Papers (18)
    PDF (268 KB)

    This paper considers a linear model with time-varying parameters controlled by a neural network to analyze and forecast nonlinear time series. We show that this formulation, called the neural coefficient smooth transition autoregressive model, is closely related to the threshold autoregressive model and the smooth transition autoregressive model, with the advantage of naturally incorporating linear m...

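    The sketch below illustrates the general mechanism of a time-varying-coefficient AR model whose coefficients are produced by a small neural network of a transition variable; the lag order, network size, and choice of transition variable are hypothetical.

        import numpy as np

        rng = np.random.default_rng(3)
        p, h = 2, 4                                    # AR order, hidden units
        V = 0.5 * rng.standard_normal((h, 1))          # transition variable -> hidden weights
        b = 0.1 * rng.standard_normal(h)
        U = 0.1 * rng.standard_normal((p, h))          # hidden -> AR-coefficient weights
        a0 = np.array([0.5, -0.2])                     # baseline (linear) AR coefficients

        def coeffs(z):
            """AR coefficients as a smooth function of the transition variable z."""
            return a0 + U @ np.tanh(V @ np.atleast_1d(z) + b)

        def forecast(y_hist):
            z = y_hist[-1]                             # last observation as transition variable
            return coeffs(z) @ y_hist[-p:][::-1]       # one-step-ahead prediction

        y = list(rng.standard_normal(p))
        for t in range(20):
            y.append(forecast(np.array(y)) + 0.1 * rng.standard_normal())
        print(y[-5:])
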
  • Variational Gaussian process classifiers

    Publication Year: 2000 , Page(s): 1458 - 1464
    Cited by:  Papers (30)
    PDF (180 KB)

    Gaussian processes are a promising nonlinear regression tool, but it is not straightforward to solve classification problems with them. In this paper, the variational methods of Jaakkola and Jordan (2000) are applied to Gaussian processes to produce an efficient Bayesian binary classifier.

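    A rough numpy sketch of a variational GP binary classifier built on the Jaakkola-Jordan logistic bound follows; the kernel, its parameters, and the predictive squashing step are assumptions, and the paper's derivation may differ in detail.

        import numpy as np

        def rbf(A, B, ell=1.0):
            d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
            return np.exp(-0.5 * d2 / ell**2)

        def lam(xi):
            # lambda(xi) = tanh(xi/2) / (4 xi), with the xi -> 0 limit 1/8
            return np.where(xi < 1e-8, 0.125, np.tanh(xi / 2.0) / (4.0 * xi))

        def fit(X, y, iters=30, jitter=1e-6):
            """y in {-1, +1}; returns variational posterior mean/cov of the latent f."""
            K = rbf(X, X) + jitter * np.eye(len(X))
            Kinv = np.linalg.inv(K)
            xi = np.ones(len(X))
            for _ in range(iters):
                S = np.linalg.inv(Kinv + 2.0 * np.diag(lam(xi)))   # posterior covariance
                m = S @ (y / 2.0)                                  # posterior mean
                xi = np.sqrt(np.diag(S) + m**2)                    # variational parameter update
            return m, S, Kinv

        def predict(Xtr, Xte, m, S, Kinv):
            ks = rbf(Xtr, Xte)
            mu = ks.T @ Kinv @ m
            var = (1.0 - np.einsum('ij,ji->i', ks.T, Kinv @ ks)
                   + np.einsum('ij,ji->i', ks.T, Kinv @ S @ Kinv @ ks))
            return 1.0 / (1.0 + np.exp(-mu / np.sqrt(1.0 + np.pi * var / 8.0)))  # class probability

        rng = np.random.default_rng(4)
        X = rng.standard_normal((40, 2))
        y = np.sign(X[:, 0] + X[:, 1])
        m, S, Kinv = fit(X, y)
        print(predict(X, X[:5], m, S, Kinv))
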
  • Self-stabilized gradient algorithms for blind source separation with orthogonality constraints

    Publication Year: 2000 , Page(s): 1490 - 1497
    Cited by:  Papers (26)
    PDF (188 KB)

    Developments in self-stabilized algorithms for gradient adaptation of orthonormal matrices have resulted in simple but powerful principal and minor subspace analysis methods. We extend these ideas to develop algorithms for instantaneous prewhitened blind separation of homogeneous signal mixtures. Our algorithms are proven to be self-stabilizing to the Stiefel manifold of orthonormal matrices, such...

  • Global stability for cellular neural networks with time delay

    Publication Year: 2000 , Page(s): 1481 - 1484
    Cited by:  Papers (119)
    PDF (92 KB)

    A sufficient condition for the existence of a unique equilibrium point and its global asymptotic stability in cellular neural networks with delay (DCNNs) is derived. It is shown that the condition depends only on the feedback matrices and is independent of the delay parameter. Furthermore, this condition is less restrictive than that given in the literature.

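    For reference, delay-independent conditions of this kind are typically stated for the standard delayed cellular neural network model (the paper's exact notation may differ):

        \frac{dx_i(t)}{dt} = -x_i(t) + \sum_{j=1}^{n} a_{ij}\, f\bigl(x_j(t)\bigr)
                             + \sum_{j=1}^{n} b_{ij}\, f\bigl(x_j(t-\tau)\bigr) + I_i,
        \qquad f(x) = \tfrac{1}{2}\bigl(|x+1| - |x-1|\bigr), \quad i = 1,\dots,n,

    where A = (a_ij) is the feedback matrix, B = (b_ij) the delayed-feedback matrix, τ ≥ 0 the delay, and I_i the bias; the sufficient condition is then expressed in terms of A and B alone.
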
  • Morphology and autowave metric on CNN applied to bubble-debris classification

    Publication Year: 2000 , Page(s): 1385 - 1393
    Cited by:  Papers (8)
    PDF (376 KB)

    We present initial results of applying a cellular neural network (CNN)-based autowave metric to high-speed pattern recognition of gray-scale images. The approach is applied to a problem involving the separation of metallic wear debris particles from air bubbles. This problem arises in an optics-based system for the determination of mechanical wear. This paper focuses on distinguishing debris particles suspended...

  • Generalization of adaptive neuro-fuzzy inference systems

    Publication Year: 2000 , Page(s): 1332 - 1346
    Cited by:  Papers (36)
    PDF (376 KB)

    The adaptive network-based fuzzy inference system (ANFIS) of Jang (1993) is extended to the generalized ANFIS (GANFIS) by proposing a generalized fuzzy model (GFM) and considering a generalized radial basis function (GRBF) network. The GFM encompasses both the Takagi-Sugeno (TS) model and the compositional rule of inference (CRI) model. The conditions under which the proposed GFM converts to TS-mode...

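    As a concrete reference point, the sketch below evaluates a small first-order Takagi-Sugeno fuzzy model, the kind of rule base that ANFIS-type systems tune; the membership shapes, rule count, and parameter values are illustrative.

        import numpy as np

        rng = np.random.default_rng(8)
        R, d = 4, 2                                      # rules, inputs
        centers = rng.standard_normal((R, d))            # Gaussian membership centers
        widths = np.full((R, d), 1.0)                    # Gaussian membership widths
        P = 0.1 * rng.standard_normal((R, d + 1))        # linear consequent parameters per rule

        def ts_output(x):
            mu = np.exp(-0.5 * ((x - centers) / widths) ** 2)   # membership grades
            w = mu.prod(axis=1)                                  # rule firing strengths
            w = w / w.sum()                                      # normalized firing strengths
            y_rule = P @ np.append(x, 1.0)                       # first-order (linear) consequents
            return float(w @ y_rule)

        print(ts_output(np.array([0.3, -0.7])))
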
  • Asynchronous self-organizing maps

    Publication Year: 2000 , Page(s): 1315 - 1322
    Cited by:  Papers (3)
    PDF (196 KB)

    A recently defined energy function which leads to a self-organizing map is used as a foundation for an asynchronous neural-network algorithm. We generalize the existing stochastic gradient approach to an asynchronous parallel stochastic gradient method for generating a topological map on a distributed computer system (MIMD). A convergence proof is presented and simulation results on a set of probl...

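    The sequential stochastic-gradient SOM update that the asynchronous scheme parallelizes is sketched below; the map size and the learning-rate and neighborhood schedules are illustrative.

        import numpy as np

        rng = np.random.default_rng(5)
        grid = np.array([(i, j) for i in range(8) for j in range(8)], dtype=float)  # 8x8 map
        W = 0.1 * rng.standard_normal((64, 3))          # one 3-D codebook vector per node
        data = rng.standard_normal((1000, 3))

        for t, x in enumerate(data):
            eta = 0.5 * (1.0 - t / len(data))           # decaying learning rate
            sigma = 3.0 * (1.0 - t / len(data)) + 0.5   # decaying neighborhood width
            win = np.argmin(((W - x) ** 2).sum(1))      # best-matching unit
            h = np.exp(-((grid - grid[win]) ** 2).sum(1) / (2 * sigma**2))  # neighborhood kernel
            W += eta * h[:, None] * (x - W)             # move codebooks toward the sample
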
  • Blind extraction of singularly mixed source signals

    Publication Year: 2000 , Page(s): 1413 - 1422
    Cited by:  Papers (19)
    PDF (240 KB)

    This paper introduces a novel technique for sequential blind extraction of singularly mixed sources. First, a neural-network model and an adaptive algorithm for single-source blind extraction are introduced. Next, an extractability analysis is presented for a singular mixing matrix, and two sets of necessary and sufficient extractability conditions are derived. The adaptive algorithm and neural-netw...

  • A robust neural controller for underwater robot manipulators

    Publication Year: 2000 , Page(s): 1465 - 1470
    Cited by:  Papers (9)
    PDF (156 KB)

    This paper presents a robust control scheme using a multilayer neural network with the error backpropagation learning algorithm. The multilayer neural network acts as a compensator for the conventional sliding mode controller, improving control performance when the initial assumptions about the uncertainty bounds of the system parameters are not valid. The proposed controller is applied to control a robot manipulator op...

  • Learning parametric specular reflectance model by radial basis function network

    Publication Year: 2000 , Page(s): 1498 - 1503
    Cited by:  Papers (3)
    PDF (364 KB)

    For the shape-from-shading problem, it is known that most real images contain specular components and are affected by unknown reflectivity. In this paper, these limitations are addressed and a neural-based specular reflectance model is proposed. The idea of this method is to optimize a proper specular model by learning the parameters of a radial basis function network and to recover the obj...

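    A generic radial basis function network fitted by linear least squares is sketched below as background; the reflectance-specific parameterization learned in the paper is not reproduced, and the data here are a stand-in "specular lobe".

        import numpy as np

        rng = np.random.default_rng(6)
        x = np.linspace(0.0, np.pi / 2, 100)                    # e.g. angle samples
        y = np.cos(x) ** 8 + 0.02 * rng.standard_normal(100)    # stand-in specular-lobe data

        centers = np.linspace(0.0, np.pi / 2, 12)
        width = 0.15
        Phi = np.exp(-(x[:, None] - centers[None, :]) ** 2 / (2 * width**2))  # design matrix
        w, *_ = np.linalg.lstsq(Phi, y, rcond=None)             # output-layer weights
        y_hat = Phi @ w
        print(float(np.mean((y - y_hat) ** 2)))                 # training mean-squared error
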
  • Neural discriminant analysis

    Publication Year: 2000 , Page(s): 1394 - 1401
    Cited by:  Papers (6)
    PDF (160 KB)

    The role of the bootstrap is highlighted for nonlinear discriminant analysis using a feedforward neural-network model. Statistical techniques are formulated in terms of the likelihood of a neural-network model when the data consist of ungrouped binary responses and a set of predictor variables. The information criterion based on the bootstrap method is shown to be f...

  • Elementary function generators for neural-network emulators

    Publication Year: 2000 , Page(s): 1438 - 1449
    Cited by:  Papers (10)
    PDF (356 KB)

    Piecewise first- and second-order approximations are employed to design commonly used elementary function generators for neural-network emulators. Three novel schemes are proposed for the first-order approximations. The first scheme requires one multiplication, one addition, and a 28-byte lookup table. The second scheme requires one addition, a 14-byte lookup table, and no multiplication. The thir...

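    The sketch below builds a piecewise first-order (slope/intercept) lookup-table approximation of the logistic sigmoid, the flavor of elementary-function generator discussed above; the segment count, range, and table layout are illustrative rather than the paper's schemes.

        import numpy as np

        def sigmoid(x):
            return 1.0 / (1.0 + np.exp(-x))

        LO, HI, SEGS = -8.0, 8.0, 64
        edges = np.linspace(LO, HI, SEGS + 1)
        x0, x1 = edges[:-1], edges[1:]
        slope = (sigmoid(x1) - sigmoid(x0)) / (x1 - x0)      # per-segment slope
        intercept = sigmoid(x0) - slope * x0                 # per-segment intercept

        def sigmoid_lut(x):
            x = np.clip(x, LO, HI - 1e-9)
            idx = ((x - LO) / (HI - LO) * SEGS).astype(int)  # table index
            return slope[idx] * x + intercept[idx]

        xs = np.linspace(-10, 10, 5)
        print(np.max(np.abs(sigmoid_lut(xs) - sigmoid(xs)))) # worst-case error on the samples
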
  • Synthesis of feedforward networks in supremum error bound

    Publication Year: 2000 , Page(s): 1213 - 1227
    Cited by:  Papers (3)
    PDF (500 KB)

    The main result of this paper is a constructive proof of a formula for the upper bound of the approximation error in L∞ (supremum norm) of multidimensional functions by feedforward networks with one hidden layer of sigmoidal units and a linear output. This result is applied to formulate a new method of neural-network synthesis. The result can also be used to estimate complexity of...

  • A comment on "On equilibria, stability, and instability of Hopfield neural networks" [and reply]

    Publication Year: 2000 , Page(s): 1506 - 1507
    Cited by:  Papers (9)
    PDF (76 KB)

    It is pointed out that the main analysis results about the existence, uniqueness, and global asymptotic stability of the equilibrium of a continuous-time Hopfield-type neural network given in the paper by Zhi-Hong Guan et al. (2000) are special cases of relevant ones previously obtained in the literature. In reply, the original authors consider the reasoning of Xue-Bin Liang's comments and state th...

  • An iterative inversion approach to blind source separation

    Publication Year: 2000 , Page(s): 1423 - 1437
    Cited by:  Papers (24)
    PDF (328 KB)

    We present an iterative inversion (II) approach to blind source separation (BSS). It consists of a quasi-Newton method for solving an estimating equation obtained from the implicit inversion of a robust estimate of the mixing system. The resulting learning rule includes several existing algorithms for BSS as particular cases, giving them a novel and unified interpretation. It also provide...

  • Voronoi networks and their probability of misclassification

    Publication Year: 2000 , Page(s): 1361 - 1372
    Cited by:  Papers (3)
    PDF (300 KB)

    To reduce memory requirements and computation cost, many algorithms have been developed that perform nearest-neighbor classification using only a small number of representative samples obtained from the training set. We call the classification model underlying all these algorithms Voronoi networks (Vnets). We analyze the generalization capabilities of these networks by bounding the gene...

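    A minimal nearest-prototype (Voronoi cell) classifier of the kind analyzed above; the prototypes and labels here are toy values.

        import numpy as np

        def predict(prototypes, labels, queries):
            # each prototype owns its Voronoi cell; a query takes the label of the nearest prototype
            d2 = ((queries[:, None, :] - prototypes[None, :, :]) ** 2).sum(-1)
            return labels[np.argmin(d2, axis=1)]

        rng = np.random.default_rng(7)
        prototypes = np.array([[0.0, 0.0], [2.0, 2.0], [0.0, 2.0]])
        labels = np.array([0, 1, 1])
        queries = rng.standard_normal((5, 2)) + 1.0
        print(predict(prototypes, labels, queries))
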
  • Building cost functions minimizing to some summary statistics

    Publication Year: 2000 , Page(s): 1263 - 1271
    Cited by:  Papers (10)
    PDF (180 KB)

    A learning machine (or model) is usually trained by minimizing a given criterion (the expectation of the cost function) that measures the discrepancy between the model output and the desired output. As is well known, the choice of the cost function has a profound impact on the probabilistic interpretation of the model output after training. In this work, we use the calculus of variati...

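    A standard worked example of how a cost function determines the summary statistic a trained model estimates (squared error gives the conditional mean, absolute error the conditional median, the pinball loss a conditional quantile):

        \arg\min_{a}\ \mathbb{E}\bigl[(Y-a)^{2}\mid X=x\bigr] = \mathbb{E}[Y\mid X=x],
        \quad\text{since}\quad \tfrac{\partial}{\partial a}\,\mathbb{E}\bigl[(Y-a)^{2}\mid X=x\bigr]
        = -2\bigl(\mathbb{E}[Y\mid X=x]-a\bigr);

        \arg\min_{a}\ \mathbb{E}\bigl[\,|Y-a|\,\mid X=x\bigr] = \operatorname{median}(Y\mid X=x);

        \arg\min_{a}\ \mathbb{E}\bigl[\rho_{q}(Y-a)\mid X=x\bigr] = F_{Y\mid X=x}^{-1}(q),
        \qquad \rho_{q}(u) = u\bigl(q-\mathbf{1}[u<0]\bigr).

    A model trained with one of these costs therefore tends, in the ideal case, toward the corresponding conditional statistic of the target.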

Aims & Scope

IEEE Transactions on Neural Networks is devoted to the science and technology of neural networks, publishing significant technical knowledge, exploratory developments, and applications of neural networks from biology to software to hardware.

 

This Transactions ceased production in 2011. The current retitled publication is IEEE Transactions on Neural Networks and Learning Systems.
