# IEEE Transactions on Neural Networks

Displaying Results 1 - 25 of 36

Publication Year: 2007, Page(s):C1 - C4
• ### IEEE Transactions on Neural Networks publication information

Publication Year: 2007, Page(s): C2
• ### Global Convergence of GHA Learning Algorithm With Nonzero-Approaching Adaptive Learning Rates

Publication Year: 2007, Page(s):1557 - 1571
Cited by:  Papers (13)

The generalized Hebbian algorithm (GHA) is one of the most widely used principal component analysis (PCA) neural network (NN) learning algorithms. The learning rates of GHA play an important role in the convergence of the algorithm in applications. Traditionally, the learning rates of GHA are required to converge to zero so that its convergence can be analyzed by studying the corresponding deterministic co...
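For orientation, Sanger's GHA update rule (the textbook form of the algorithm the abstract refers to; the paper's contribution concerns the learning rates, not the rule itself) can be sketched with a small constant learning rate:

```python
import numpy as np

def gha_step(W, x, eta):
    """One generalized Hebbian algorithm (Sanger) update.

    W: (k, d) weight matrix whose rows converge to the top-k
       principal eigenvectors of the input covariance.
    x: (d,) zero-mean input sample.
    eta: learning rate; a constant small value is used here for
         illustration (the paper analyzes nonzero-approaching rates).
    """
    y = W @ x  # component outputs
    # Hebbian term minus lower-triangular decorrelation term
    W += eta * (np.outer(y, x) - np.tril(np.outer(y, y)) @ W)
    return W

rng = np.random.default_rng(0)
# synthetic zero-mean data with one dominant variance direction
data = rng.normal(size=(5000, 3)) * np.array([3.0, 1.0, 0.3])
W = rng.normal(scale=0.1, size=(2, 3))
for x in data:
    W = gha_step(W, x, eta=0.01)
# the first row should align with the highest-variance axis
print(np.abs(W[0] / np.linalg.norm(W[0])))
```

The lower-triangular term is what distinguishes GHA from plain Oja learning: it forces each row to learn the component orthogonal to the ones above it.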

• ### Semiparametric Regression Using Student $t$ Processes

Publication Year: 2007, Page(s):1572 - 1588
Cited by:  Papers (2)

In this paper, we propose a latent factor regression model, in which priors are assigned to both the latent regression vector and the error term, by using reproducing kernels. The resulting regression function follows a stochastic process known as a Student $t$ process. The model is attractive because its implementation is based on a tractable posterior predictive distribution and a simple expectation...

• ### On the Convergence of Multiplicative Update Algorithms for Nonnegative Matrix Factorization

Publication Year: 2007, Page(s):1589 - 1596
Cited by:  Papers (128)  |  Patents (3)

Nonnegative matrix factorization (NMF) is useful for finding basis information of nonnegative data. Currently, multiplicative updates are a simple and popular way to find the factorization. However, for the common NMF approach of minimizing the Euclidean distance between approximate and true values, no proof has shown that multiplicative updates converge to a stationary point of the NMF optimization p...
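The multiplicative updates in question are the classic Lee-Seung rules for the Euclidean objective. A minimal sketch of the original (unmodified) rules follows; the paper's convergence analysis and any modified variant are not reproduced here:

```python
import numpy as np

def nmf_multiplicative(V, k, iters=200, eps=1e-9, seed=0):
    """Lee-Seung multiplicative updates for min ||V - W H||_F^2
    with W >= 0, H >= 0. Each update keeps the factors nonnegative
    and does not increase the Euclidean objective."""
    rng = np.random.default_rng(seed)
    m, n = V.shape
    W = rng.random((m, k)) + eps
    H = rng.random((k, n)) + eps
    for _ in range(iters):
        H *= (W.T @ V) / (W.T @ W @ H + eps)  # update H with W fixed
        W *= (V @ H.T) / (W @ H @ H.T + eps)  # update W with H fixed
    return W, H

V = np.abs(np.random.default_rng(1).random((6, 8)))
W, H = nmf_multiplicative(V, k=3)
err = np.linalg.norm(V - W @ H) / np.linalg.norm(V)
print(err)  # relative reconstruction error
```

The small `eps` in the denominators is a common numerical guard against division by zero; it is an implementation convenience, not part of the original derivation.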

• ### An Adaptable Connectionist Text-Retrieval System With Relevance Feedback

Publication Year: 2007, Page(s):1597 - 1613
Cited by:  Papers (2)

This paper introduces a new connectionist network for certain domain-specific text-retrieval and search applications with expert end users. A new model reference adaptive system is proposed that involves three learning phases. Initial model-reference learning is first performed based upon an ensemble set of input-output pairs from an initial reference model. Model-reference following is needed in dynamic ...

• ### Neural Network Approach to Background Modeling for Video Object Segmentation

Publication Year: 2007, Page(s):1614 - 1627
Cited by:  Papers (72)  |  Patents (16)

This paper presents a novel background modeling and subtraction approach for video object segmentation. A neural network (NN) architecture is proposed to form an unsupervised Bayesian classifier for this application domain. The constructed classifier efficiently handles the segmentation in natural-scene sequences with complex background motion and changes in illumination. The weights of the propos...

• ### The Bayesian ARTMAP

Publication Year: 2007, Page(s):1628 - 1644
Cited by:  Papers (27)

In this paper, we modify the fuzzy ARTMAP (FA) neural network (NN) using the Bayesian framework in order to improve its classification accuracy while simultaneously reducing its category proliferation. The proposed algorithm, called Bayesian ARTMAP (BA), preserves the FA advantages and also enhances its performance by the following: (1) representing a category using a multidimensional Gaussian distr...

• ### The Hierarchical Fast Learning Artificial Neural Network (HieFLANN)—An Autonomous Platform for Hierarchical Neural Network Construction

Publication Year: 2007, Page(s):1645 - 1657
Cited by:  Papers (15)

The hierarchical fast learning artificial neural network (HieFLANN) is a clustering NN that can be initialized using statistical properties of the data set. This provides the possibility of constructing the entire network autonomously with no manual intervention. This distinguishes it from many existing networks that, though hierarchically plausible, still require manual initialization processes. ...

• ### Hierarchically Clustered Adaptive Quantization CMAC and Its Learning Convergence

Publication Year: 2007, Page(s):1658 - 1682
Cited by:  Papers (17)

The cerebellar model articulation controller (CMAC) neural network (NN) is a well-established computational model of the human cerebellum. Nevertheless, there are two major drawbacks associated with the uniform quantization scheme of the CMAC network. They are the following: (1) a constant output resolution associated with the entire input space and (2) the generalization-accuracy dilemma. Moreove...

• ### Density-Driven Generalized Regression Neural Networks (DD-GRNN) for Function Approximation

Publication Year: 2007, Page(s):1683 - 1696
Cited by:  Papers (16)

This paper proposes a new nonparametric regression method, based on the combination of generalized regression neural networks (GRNNs), density-dependent multiple kernel bandwidths, and regularization. The presented model is generic and substitutes the very large number of bandwidths with a much smaller number of trainable weights that control the regression model. It depends on sets of extracted d...
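For context, a classic GRNN prediction is a Nadaraya-Watson kernel average over the training set. The sketch below uses a single global bandwidth; the paper's contribution replaces this with density-driven, trainable per-kernel bandwidths, which are not reproduced here:

```python
import numpy as np

def grnn_predict(X_train, y_train, X_query, bandwidth):
    """Classic GRNN (Nadaraya-Watson) regression: each prediction
    is a Gaussian-kernel-weighted average of the training targets."""
    # squared distances between every query and every training point
    d2 = ((X_query[:, None, :] - X_train[None, :, :]) ** 2).sum(-1)
    K = np.exp(-d2 / (2.0 * bandwidth ** 2))
    return (K @ y_train) / K.sum(axis=1)

X = np.linspace(0, 1, 50)[:, None]
y = np.sin(2 * np.pi * X[:, 0])
Xq = np.array([[0.25], [0.75]])
print(grnn_predict(X, y, Xq, bandwidth=0.05))
```

With one global bandwidth the estimator over- or under-smooths wherever the data density varies, which is exactly the problem density-dependent bandwidths address.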

• ### Solving Generally Constrained Generalized Linear Variational Inequalities Using the General Projection Neural Networks

Publication Year: 2007, Page(s):1697 - 1708
Cited by:  Papers (14)

Generalized linear variational inequality (GLVI) is an extension of the canonical linear variational inequality. In recent years, a recurrent neural network (NN) called general projection neural network (GPNN) was developed for solving GLVIs with simple bound (often box-type or sphere-type) constraints. The aim of this paper is twofold. First, some further stability results of the GPNN are present...

• ### Locally Weighted Online Approximation-Based Control for Nonaffine Systems

Publication Year: 2007, Page(s):1709 - 1724
Cited by:  Papers (8)

This paper is concerned with tracking control problems for nonlinear systems that are not affine in the control signal and that contain unknown nonlinearities in the system dynamic equations. This paper develops a piecewise linear approximation to the unknown functions during the system operation. New control and parameter adaptation algorithms are designed and analyzed using Lyapunov-like methods...

• ### Fixed-Final-Time-Constrained Optimal Control of Nonlinear Systems Using Neural Network HJB Approach

Publication Year: 2007, Page(s):1725 - 1737
Cited by:  Papers (39)

In this paper, fixed-final-time constrained optimal control laws using neural networks (NNs) to solve Hamilton-Jacobi-Bellman (HJB) equations for general affine-in-the-control constrained nonlinear systems are proposed. An NN is used to approximate the time-varying cost function using the method of least squares on a predefined region. The result is an NN nearly optimal constrained feedback controller that has t...

• ### Reduced Pattern Training Based on Task Decomposition Using Pattern Distributor

Publication Year: 2007, Page(s):1738 - 1749
Cited by:  Papers (9)

Task decomposition with a pattern distributor (PD) is a new task decomposition method for multilayered feedforward neural networks (NNs). A pattern distributor network that implements this new task decomposition method is proposed. We propose a theoretical model to analyze the performance of the pattern distributor network. A method named reduced pattern training (RPT) is also introduced, aiming to improv...

• ### Block-Based Neural Networks for Personalized ECG Signal Classification

Publication Year: 2007, Page(s):1750 - 1761
Cited by:  Papers (92)

This paper presents evolvable block-based neural networks (BbNNs) for personalized ECG heartbeat pattern classification. A BbNN consists of a 2-D array of modular component NNs with flexible structures and internal configurations that can be implemented using reconfigurable digital hardware such as field-programmable gate arrays (FPGAs). Signal flow between the blocks determines the internal confi...

• ### Compact Modeling of Data Using Independent Variable Group Analysis

Publication Year: 2007, Page(s):1762 - 1776
Cited by:  Papers (3)

In this paper, we introduce a modeling approach called independent variable group analysis (IVGA), which can be used for finding an efficient structural representation for a given data set. The basic idea is to group the variables of the data set so that mutually dependent variables are grouped together, whereas mutually independent or weakly dependent variables end up in separa...

• ### Dynamics of Generalized PCA and MCA Learning Algorithms

Publication Year: 2007, Page(s):1777 - 1784
Cited by:  Papers (14)

Principal component analysis (PCA) and minor component analysis (MCA) are two important statistical tools with many applications in signal processing and data analysis. PCA and MCA neural networks (NNs) can be used to extract principal and minor components from input data online. It is of interest to develop generalized learning algorithms for PCA and MCA NNs. Some nove...

• ### A Biologically Inspired Spiking Neural Network for Sound Source Lateralization

Publication Year: 2007, Page(s):1785 - 1799
Cited by:  Papers (8)  |  Patents (6)

In this paper, a binaural sound source lateralization spiking neural network (NN) is presented, inspired by recent neurophysiological studies on the role of certain nuclei in the superior olivary complex (SOC) and the inferior colliculus (IC). The binaural sound source lateralization neural network (BiSoLaNN) is a spiking NN based on neural mechanisms, utilizing complex neural mo...

• ### Quarterly Time-Series Forecasting With Neural Networks

Publication Year: 2007, Page(s):1800 - 1814
Cited by:  Papers (58)

Forecasting of time series that have seasonal and other variations remains an important problem for forecasters. This paper presents a neural network (NN) approach to forecasting quarterly time series. With a large data set of 756 quarterly time series from the M3 forecasting competition, we conduct a comprehensive investigation of the effectiveness of several data preprocessing and modeling appro...

• ### Synchrony in Silicon: The Gamma Rhythm

Publication Year: 2007, Page(s):1815 - 1825
Cited by:  Papers (42)

In this paper, we present a network of silicon interneurons that synchronize in the gamma frequency range (20-80 Hz). The gamma rhythm strongly influences neuronal spike timing within many brain regions, potentially playing a crucial role in computation. Yet it has largely been ignored in neuromorphic systems, which use mixed analog and digital circuits to model neurobiology in silicon. Our neuro...

• ### A Fast Tracking Algorithm for Generalized LARS/LASSO

Publication Year: 2007, Page(s):1826 - 1830
Cited by:  Papers (5)  |  Patents (1)

This letter gives an efficient algorithm for tracking the solution curve of sparse logistic regression with respect to the regularization parameter. The algorithm is based on approximating the logistic regression loss by a piecewise quadratic function, using Rosset and Zhu's path tracking algorithm on the approximate problem, and then applying a correction to get to the true path. Application of t...
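The letter's exact path-tracking algorithm is not reproduced here. As a rough stand-in that illustrates what "tracking the solution curve" means, the squared-error LASSO path can be traced by warm-started coordinate descent over a decreasing grid of regularization parameters:

```python
import numpy as np

def soft_threshold(z, t):
    """Soft-thresholding operator, the proximal map of the l1 norm."""
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def lasso_path(X, y, lambdas, iters=100):
    """Trace LASSO solutions over a decreasing lambda grid, warm-starting
    each solve from the previous one (a crude stand-in for exact path
    tracking). Objective per lambda: 0.5*||y - X b||^2 + n*lambda*||b||_1."""
    n, d = X.shape
    beta = np.zeros(d)
    col_sq = (X ** 2).sum(axis=0)
    path = []
    for lam in lambdas:
        for _ in range(iters):
            for j in range(d):
                # partial residual with coordinate j removed
                r = y - X @ beta + X[:, j] * beta[j]
                beta[j] = soft_threshold(X[:, j] @ r, n * lam) / col_sq[j]
        path.append(beta.copy())
    return np.array(path)

rng = np.random.default_rng(2)
X = rng.normal(size=(50, 5))
beta_true = np.array([2.0, 0.0, 0.0, -1.5, 0.0])
y = X @ beta_true                         # noiseless sparse signal
P = lasso_path(X, y, lambdas=[1.0, 0.1, 0.001])
print(P[-1])                              # close to beta_true
```

Grid-based tracking like this only samples the path; the appeal of LARS-style algorithms is that they follow the exact piecewise-linear (or, for logistic loss, piecewise-smooth) solution curve between breakpoints.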

• ### NN-Based Adaptive Tracking Control of Uncertain Nonlinear Systems Disturbed by Unknown Covariance Noise

Publication Year: 2007, Page(s):1830 - 1835
Cited by:  Papers (49)

A class of uncertain nonlinear systems that are additionally driven by unknown covariance noise is considered. Based on the backstepping technique, adaptive neural control schemes are developed to solve the output tracking control problem for such systems. As proven by stability analysis, the proposed controller guarantees that all the error variables are bounded with desired probability in a...

• ### Global $\mu$-Stability of Delayed Neural Networks With Unbounded Time-Varying Delays

Publication Year: 2007, Page(s):1836 - 1840
Cited by:  Papers (50)

In this letter, dynamical systems with unbounded time-varying delays are investigated. We address the following question: to what extent can time-varying delays grow while the system remains stable? Moreover, a new concept of stability, global $\mu$-stability, is proposed. Under mild conditions, we prove that dynamical systems with unbounded time-varying delays are globally $\mu$-stable.
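As a sketch of the stability notion named in the title (hedged: the precise definition and the mild conditions are in the letter itself), an equilibrium $x^*$ is commonly called globally $\mu$-stable when there exist a constant $M > 0$ and a positive function $\mu(t) \to \infty$ such that every trajectory satisfies

$$\|x(t) - x^*\| \le \frac{M}{\mu(t)}, \qquad t \ge t_0.$$

Choosing $\mu(t) = e^{\varepsilon t}$ recovers global exponential stability, while $\mu(t) = t^{\varepsilon}$ gives power-rate stability, so the notion interpolates between familiar convergence rates while tolerating unbounded delays.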

• ### Adaptive Synchronization Between Two Different Chaotic Neural Networks With Time Delay

Publication Year: 2007, Page(s):1841 - 1845
Cited by:  Papers (103)

This letter presents an adaptive synchronization scheme between two different kinds of delayed chaotic neural networks (NNs) with partly unknown parameters. An adaptive controller is designed to guarantee the global asymptotic synchronization of state trajectories for two different chaotic NNs with time delay. An illustrative example is given to demonstrate the effectiveness of the present method.

## Aims & Scope

IEEE Transactions on Neural Networks is devoted to the science and technology of neural networks, publishing significant technical knowledge, exploratory developments, and applications of neural networks from biology to software to hardware.

This Transactions ceased production in 2011. The current retitled publication is IEEE Transactions on Neural Networks and Learning Systems.
