
# IEEE Transactions on Neural Networks

Displaying Results 1 - 22 of 22

Publication Year: 2009, Page(s): C1
PDF (39 KB)
• ### IEEE Transactions on Neural Networks publication information

Publication Year: 2009, Page(s): C2
PDF (38 KB)
• ### Artificial Neural Network Method for Solution of Boundary Value Problems With Exact Satisfaction of Arbitrary Boundary Conditions

Publication Year: 2009, Page(s): 1221 - 1233
Cited by: Papers (27)
PDF (3898 KB) | HTML

A method for solving boundary value problems (BVPs) is introduced using artificial neural networks (ANNs) for irregular domain boundaries with mixed Dirichlet/Neumann boundary conditions (BCs). The approximate ANN solution automatically satisfies BCs at all stages of training, including before training commences. This method is simpler than other ANN methods for solving BVPs due to its unconstrain...

• ### A $Q$-Learning Approach to Derive Optimal Consumption and Investment Strategies

Publication Year: 2009, Page(s): 1234 - 1243
Cited by: Papers (5)
PDF (1366 KB) | HTML

In this paper, we consider optimal consumption and strategic asset allocation decisions of an investor with a finite planning horizon. A Q-learning approach is used to maximize the expected utility of consumption. The first part of the paper conceptually presents the implementation of Q-learning in a discrete state-action space and illustrates the relation of the technique to the dynamic programm...
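The abstract above refers to Q-learning in a discrete state-action space. As a minimal illustrative sketch (not the paper's consumption-investment model): the toy environment, reward, and all parameters below are assumptions chosen only to show the tabular update rule.

```python
import numpy as np

# Toy tabular Q-learning in a discrete state-action space; the transition
# model, reward, and parameters are illustrative assumptions.
rng = np.random.default_rng(0)
n_states, n_actions = 5, 3
Q = np.zeros((n_states, n_actions))
alpha, gamma, eps = 0.1, 0.95, 0.1   # learning rate, discount, exploration

def step(state, action):
    """Hypothetical environment: deterministic transition, reward at state 0."""
    next_state = (state + action + 1) % n_states
    reward = 1.0 if next_state == 0 else 0.0
    return next_state, reward

state = 0
for _ in range(5000):
    # epsilon-greedy action selection
    if rng.random() < eps:
        action = int(rng.integers(n_actions))
    else:
        action = int(np.argmax(Q[state]))
    next_state, reward = step(state, action)
    # Q-learning update: move Q(s, a) toward r + gamma * max_a' Q(s', a')
    Q[state, action] += alpha * (reward + gamma * Q[next_state].max() - Q[state, action])
    state = next_state
```

The same tabular update underlies the discrete-state formulation discussed in the paper, where it connects to the dynamic programming view of the problem.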

• ### Managing Category Proliferation in Fuzzy ARTMAP Caused by Overlapping Classes

Publication Year: 2009, Page(s): 1244 - 1253
Cited by: Papers (1)
PDF (437 KB) | HTML

This paper addresses the difficulties brought about by overlapping classes in fuzzy ARTMAP (FAM). Training with such data leads to category proliferation, and classification is made difficult not only by the large number of categories but also by the fact that such data can belong to either class. In this paper, changes are proposed to allow more than one class to be predicted during classification,...

• ### A Multiscale Scheme for Approximating the Quantron's Discriminating Function

Publication Year: 2009, Page(s): 1254 - 1266
Cited by: Papers (4)
PDF (444 KB) | HTML

Finding an accurate approximation of a discriminating function in order to evaluate its extrema is a common problem in the field of machine learning. A new type of neural network, the Quantron, generates a complicated wave function whose global maximum value is crucial for classifying patterns. To obtain an analytical approximation of this maximum, we present a multiscale scheme based on compactly...

• ### Segmented-Memory Recurrent Neural Networks

Publication Year: 2009, Page(s): 1267 - 1280
Cited by: Papers (5)
PDF (814 KB) | HTML

Conventional recurrent neural networks (RNNs) have difficulties in learning long-term dependencies. To tackle this problem, we propose an architecture called segmented-memory recurrent neural network (SMRNN). A symbolic sequence is broken into segments and then presented as inputs to the SMRNN one symbol per cycle. The SMRNN uses separate internal states to store symbol-level context, as well as s...

• ### BAM Learning of Nonlinearly Separable Tasks by Using an Asymmetrical Output Function and Reinforcement Learning

Publication Year: 2009, Page(s): 1281 - 1292
Cited by: Papers (8)
PDF (1637 KB) | HTML

Most bidirectional associative memory (BAM) networks use a symmetrical output function for dual fixed-point behavior. In this paper, we show that by introducing an asymmetry parameter into a recently introduced chaotic BAM output function, prior knowledge can be used to momentarily disable desired attractors from memory, hence biasing the search space to improve recall performance. This property a...

• ### Universal Perceptron and DNA-Like Learning Algorithm for Binary Neural Networks: Non-LSBF Implementation

Publication Year: 2009, Page(s): 1293 - 1301
Cited by: Papers (20)
PDF (644 KB) | HTML

Implementing linearly nonseparable Boolean functions (non-LSBF) has been an important and yet challenging task due to the extremely high complexity of this kind of function and the exponentially increasing proportion of non-LSBF in the entire set of Boolean functions as the number of input variables increases. In this paper, an algorithm named DNA-like learning and decomposing algor...

• ### Fuzzy Associative Conjuncted Maps Network

Publication Year: 2009, Page(s): 1302 - 1319
Cited by: Papers (6)
PDF (1935 KB) | HTML

The fuzzy associative conjuncted maps (FASCOM) is a fuzzy neural network that associates data of nonlinearly related inputs and outputs. In the network, each input or output dimension is represented by a feature map that is partitioned into fuzzy or crisp sets. These fuzzy sets are then conjuncted to form antecedents and consequences, which are subsequently associated to form if-then rules. The as...

• ### Asymptotic Tracking of Uncertain Systems With Continuous Control Using Adaptive Bounding

Publication Year: 2009, Page(s): 1320 - 1329
Cited by: Papers (14)
PDF (890 KB) | HTML

This paper presents a robust adaptive control design method for a class of multiple-input-multiple-output uncertain nonlinear systems in the presence of parametric and nonparametric uncertainties and bounded disturbances. Using the approximation properties of the unknown continuous nonlinearities and the adaptive bounding technique, the developed controller achieves asymptotic convergence of the t...

• ### Stability Analysis of Discrete-Time Recurrent Neural Networks With Stochastic Delay

Publication Year: 2009, Page(s): 1330 - 1339
Cited by: Papers (26)
PDF (589 KB) | HTML

This paper is concerned with the stability analysis of discrete-time recurrent neural networks (RNNs) with time delays as random variables drawn from some probability distribution. By introducing the variation probability of the time delay, a common delayed discrete-time RNN system is transformed into one with stochastic parameters. Improved conditions for the mean square stability of these system...

• ### Large Memory Capacity in Chaotic Artificial Neural Networks: A View of the Anti-Integrable Limit

Publication Year: 2009, Page(s): 1340 - 1351
Cited by: Papers (6)
PDF (883 KB) | HTML

In the literature, it was reported that the chaotic artificial neural network model with sinusoidal activation functions possesses a large memory capacity as well as a remarkable ability of retrieving the stored patterns, better than the conventional chaotic model with only monotonic activation functions such as sigmoidal functions. This paper, from the viewpoint of the anti-integrable limit, eluc...

• ### Error Minimized Extreme Learning Machine With Growth of Hidden Nodes and Incremental Learning

Publication Year: 2009, Page(s): 1352 - 1357
Cited by: Papers (244)
PDF (251 KB) | HTML

One of the open problems in neural network research is how to automatically determine network architectures for given applications. In this brief, we propose a simple and efficient approach to automatically determine the number of hidden nodes in generalized single-hidden-layer feedforward networks (SLFNs), which need not be neural alike. This approach, referred to as error minimized extreme learnin...
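The abstract above concerns growing the hidden layer of an extreme learning machine until the training error is small enough. As a rough sketch of that idea (not the paper's algorithm, which updates the solution incrementally rather than re-solving): the data, tolerance, and node limit below are assumptions, and the least-squares fit is re-run from scratch at each step for simplicity.

```python
import numpy as np

rng = np.random.default_rng(1)
# Toy regression data (illustrative assumption).
X = rng.uniform(-1, 1, size=(200, 2))
y = np.sin(X[:, 0]) + 0.5 * X[:, 1]

def hidden(X, W, b):
    # Sigmoid hidden layer with random input weights W and biases b,
    # as in a basic extreme learning machine.
    return 1.0 / (1.0 + np.exp(-(X @ W + b)))

W = np.empty((2, 0))
b = np.empty(0)
for n in range(1, 51):                  # grow hidden nodes one at a time
    W = np.hstack([W, rng.normal(size=(2, 1))])
    b = np.append(b, rng.normal())
    H = hidden(X, W, b)
    # Output weights by least squares (only these are trained).
    beta, *_ = np.linalg.lstsq(H, y, rcond=None)
    err = np.sqrt(np.mean((H @ beta - y) ** 2))
    if err < 0.05:                      # stop once training RMSE is small
        break
```

The paper's contribution is an efficient incremental update when nodes are added; the naive re-solve above only illustrates the growth-until-error-target loop.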

• ### Model Selection Criteria for Image Restoration

Publication Year: 2009, Page(s): 1357 - 1363
Cited by: Papers (19)
PDF (1073 KB) | HTML

In this brief, the image restoration problem is approached as a learning system problem, in which a model is to be selected and parameters are estimated. Although the parameters which correspond to the restored image can easily be obtained, their quality depends heavily on a proper choice of the regularization parameter that controls the tradeoff between fidelity to the blurred noisy observed image...

• ### “Vague-to-Crisp” Neural Mechanism of Perception

Publication Year: 2009, Page(s): 1363 - 1367
Cited by: Papers (43)
PDF (355 KB) | HTML

This brief describes neural modeling fields (NMFs) for object perception, a bio-inspired paradigm. We discuss previous difficulties in object perception algorithms encountered since the 1950s, and describe how NMF overcomes these difficulties. NMF mechanisms are compared to recent experimental neuroimaging observations, which have demonstrated that initial top-down signals are vague and during per...

• ### Simple Artificial Neural Networks That Match Probability and Exploit and Explore When Confronting a Multiarmed Bandit

Publication Year: 2009, Page(s): 1368 - 1371
Cited by: Papers (8)
PDF (112 KB) | HTML

The matching law (Herrnstein 1961) states that response rates become proportional to reinforcement rates; this is related to the empirical phenomenon called probability matching (Vulkan 2000). Here, we show that a simple artificial neural network generates responses consistent with probability matching. This behavior was then used to create an operant procedure for network learning. We use the mul...
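Probability matching, as described in the abstract above, means an agent's response rates track reward probabilities rather than maximizing. A minimal sketch of that behavior on a two-armed bandit (not the paper's network model; the payoff probabilities, learning rate, and trial counts are assumptions):

```python
import numpy as np

# Probability matching on a two-armed bandit: respond in proportion to
# running reward-probability estimates instead of always picking the best arm.
rng = np.random.default_rng(2)
p_reward = np.array([0.7, 0.3])      # true payoff probability of each arm
est = np.array([0.5, 0.5])           # running reward-probability estimates
lr = 0.01                            # estimate learning rate

choices = []
for _ in range(20000):
    p_choice = est / est.sum()       # matching: allocate responses proportionally
    arm = int(rng.choice(2, p=p_choice))
    reward = float(rng.random() < p_reward[arm])
    est[arm] += lr * (reward - est[arm])   # exponential moving average
    choices.append(arm)

# Late-trial choice frequency of arm 0 approaches its relative payoff (~0.7):
# probability matching rather than pure maximization (which would give ~1.0).
freq_arm0 = float(np.mean(np.array(choices[-5000:]) == 0))
```

Because responses remain proportional rather than greedy, the agent keeps exploring the worse arm, which is the exploit-and-explore behavior the title refers to.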

• ### Corrections to “Basins of Attraction in Fully Asynchronous Discrete-Time Discrete-State Dynamic Networks” [Mar 06 397-408]

Publication Year: 2009, Page(s): 1372 - 1374
PDF (122 KB) | HTML

This paper provides a correction to the formulation of the basins of fixed-point states of fully asynchronous discrete-time discrete-state dynamic networks presented in the above titled paper (ibid., vol. 17, no. 2, pp. 397-408, Mar 06). In our subsequent work on totally asynchronous systems, we discovered that the formulation given in that previous paper lacks an additional condition. We pres...

• ### 2010 IEEE World Congress on Computational Intelligence (WCCI)

Publication Year: 2009, Page(s): 1375
PDF (755 KB)

Publication Year: 2009, Page(s): 1376
PDF (188 KB)
• ### IEEE Computational Intelligence Society Information

Publication Year: 2009, Page(s): C3
PDF (36 KB)
• ### IEEE Transactions on Neural Networks Information for authors

Publication Year: 2009, Page(s): C4
PDF (39 KB)

## Aims & Scope

IEEE Transactions on Neural Networks is devoted to the science and technology of neural networks, disclosing significant technical knowledge, exploratory developments, and applications of neural networks from biology to software to hardware.

This Transactions ceased production in 2011. The current retitled publication is IEEE Transactions on Neural Networks and Learning Systems.
