Proceedings of the International Joint Conference on Neural Networks, 2003.

20-24 July 2003

Displaying Results 1 - 25 of 153
  • A study on on-line learning of NNTrees

    Publication Year: 2003, Page(s):2540 - 2545 vol.4

    A neural network tree (NNTree) is a hybrid learning model with the overall structure being a decision tree (DT), and each nonterminal node containing a neural network (NN). Using NNTrees, it is possible to learn new knowledge online by adjusting the NNs in the nonterminal nodes. It is also possible to understand the learned knowledge online because the NNs in the nonterminal nodes are usually very... (See the sketch after this entry.)

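    Sketch: the NNTree structure described above (a decision tree whose nonterminal nodes each hold a small neural network that routes samples and can be adjusted online) can be illustrated roughly as follows. This is not the authors' algorithm; the class name, the use of scikit-learn's MLPClassifier, and the online update via partial_fit are assumptions made only for illustration.

    ```python
    # Toy NNTree-style node: a decision-tree node whose split is a small neural
    # network that can be refined online. Illustrative only.
    import numpy as np
    from sklearn.neural_network import MLPClassifier

    class NNTreeNode:
        def __init__(self, n_branches=2):
            self.net = MLPClassifier(hidden_layer_sizes=(8,))  # routing NN at this node
            self.branches = np.arange(n_branches)
            self.children = {}      # branch index -> child NNTreeNode or class label

        def update(self, X, branch_targets):
            # Online learning step: adjust only this node's NN (first call initializes it).
            self.net.partial_fit(X, branch_targets, classes=self.branches)

        def route(self, x):
            # Send one sample down the branch chosen by this node's NN.
            return int(self.net.predict(np.asarray(x).reshape(1, -1))[0])

    # Toy usage: feed data to the root node in small online batches.
    rng = np.random.default_rng(0)
    X = rng.normal(size=(40, 2))
    y = (X[:, 0] > 0).astype(int)       # desired branch for each sample
    root = NNTreeNode()
    for i in range(0, 40, 10):
        root.update(X[i:i+10], y[i:i+10])
    print(root.route(np.array([1.5, 0.0])))
    ```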
  • Noise supplement learning algorithm for associative memories using multilayer perceptrons and sparsely interconnected neural networks

    Publication Year: 2003, Page(s):2534 - 2539 vol.4

    We have proposed associative memories using multilayer perceptrons (MLPs) and sparsely interconnected neural networks (SINNs), named MLP-SINN, to improve SINNs without increasing their interconnections. MLP-SINN is more suitable for hardware implementation than an SINN with a large number of interconnections. However, the capabilities of the MLP and the SINN are not effectively used in the conven...

  • Generalized associative memory models for data fusion

    Publication Year: 2003, Page(s):2528 - 2533 vol.4
    Cited by:  Patents (14)

    The Hopfield and bi-directional associative memory (BAM) models are well developed and carefully studied models for associative memory that are patterned after the memory structure of the animal brain. Their basic limitation is that they can only perform associations between at most two sets of patterns. Several different models for generalized associative memory are proposed. These models are all... (See the sketch after this entry.)

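    For reference, a minimal sketch of the classical single-set Hopfield associative memory (Hebbian outer-product storage and iterative recall) is given below; the paper's generalized multi-set models for data fusion are not reproduced, and the toy patterns are placeholders.

    ```python
    # Classical Hopfield associative memory: Hebbian storage and synchronous recall.
    import numpy as np

    def store(patterns):
        """Hebbian outer-product rule over bipolar (+1/-1) patterns."""
        P = np.asarray(patterns, dtype=float)
        W = P.T @ P / P.shape[1]
        np.fill_diagonal(W, 0.0)        # no self-connections
        return W

    def recall(W, probe, steps=10):
        """Iterate the state update until it (hopefully) settles on a stored pattern."""
        s = np.asarray(probe, dtype=float).copy()
        for _ in range(steps):
            s = np.sign(W @ s)
            s[s == 0] = 1.0             # break ties deterministically
        return s

    patterns = np.array([[1, -1, 1, -1, 1, -1],
                         [1, 1, 1, -1, -1, -1]])
    W = store(patterns)
    noisy = np.array([1, -1, 1, -1, 1, 1])   # first pattern with one flipped bit
    print(recall(W, noisy))
    ```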
  • Radial basis network approach for nonlinear filtering in discrete time

    Publication Year: 2003, Page(s):2433 - 2437 vol.4
    Cited by:  Papers (1)

    This paper presents a new method to deal with nonlinear filtering problems in discrete time. Our approach is based on radial basis neural networks and on the principle of particle filters. More precisely, the usual learning phase of the network is replaced by the generation of a large number of particles, i.e. simulated system trajectories. The particles so generated correspond to neural centers. In spite of i... (See the sketch after this entry.)

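    The idea in the abstract (replace the usual training phase by simulating many particles, i.e. system trajectories, and using the simulated states as radial-basis centres) can be roughed out as below. The toy dynamics, kernel width, and ridge-regularized readout are assumptions for illustration, not the authors' construction.

    ```python
    # Rough sketch: simulated particles become RBF centres; the readout is then
    # fitted by regularized least squares. Dynamics, width and target are toy choices.
    import numpy as np

    rng = np.random.default_rng(0)

    def simulate_particle(T=10):
        """One simulated trajectory of a toy nonlinear discrete-time system."""
        x, traj = rng.normal(), []
        for _ in range(T):
            x = 0.8 * np.sin(x) + 0.1 * rng.normal()   # toy dynamics + process noise
            traj.append(x)
        return np.array(traj)

    # Particles: many simulated trajectories; their states serve as RBF centres.
    centers = np.concatenate([simulate_particle() for _ in range(20)]).reshape(-1, 1)

    def rbf_features(X, centers, width=0.5):
        d2 = (X - centers.T) ** 2                      # squared distance to each centre
        return np.exp(-d2 / (2.0 * width ** 2))

    # Fit the readout to a toy target (the one-step-ahead conditional mean).
    y = 0.8 * np.sin(centers[:, 0])
    Phi = rbf_features(centers, centers)
    w = np.linalg.solve(Phi.T @ Phi + 1e-3 * np.eye(Phi.shape[1]), Phi.T @ y)
    print(rbf_features(np.array([[0.3]]), centers) @ w)  # predicted next-state mean
    ```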
  • Learning-possibility of neuron model can recognize depth-rotation in three-dimension space

    Publication Year: 2003, Page(s):2523 - 2527 vol.4

    We propose a neuron model to learn depth-rotation movement in three-dimensional space. The model imitates the structure of a biological neuron, and we examine whether such a system has a reasonable function. Koch, Poggio and Torre (1982) believed that an inhibitory signal would shunt the excitatory signal on the dendrites, and that the signal functions as...

  • Associative memory using ratio rule for multi-valued pattern association

    Publication Year: 2003, Page(s):2518 - 2522 vol.4
    Cited by:  Papers (5)

    A novel learning algorithm, named ratio rule, for association of multi-valued patterns in a recurrent neural network is proposed in this paper. The learning is performed based on the degree of similarity between the relative magnitudes of the output of each neuron with respect to that of all other neurons. The dynamics of the neural network functions as a line attractor as opposed to the common co...

  • Growing neural network for acquisition of 2-layer structure

    Publication Year: 2003, Page(s):2512 - 2517 vol.4
    Cited by:  Papers (1)

    Neural networks are broadly used to approximate non-linear functions. However, it is difficult to determine an appropriate structure for a given problem. In this paper, "growing neural network" is proposed as an extension of backpropagation (BP) learning. The propagated error signal is diffused from a target neuron as a substance. The axon of a growing neuron grows according to the concentration grad...

  • Finding least cost proofs using high order recurrent networks

    Publication Year: 2003, Page(s):2803 - 2806 vol.4

    Cost-based abduction (CBA) is an important AI formalism for representing knowledge under uncertainty. In this formalism, evidence to be explained is treated as a goal to be proven, proofs have costs based on how much needs to be assumed to complete the proof, and the set of assumptions needed to complete the least-cost proof are taken as the best explanation for the given evidence. The problem of ...

  • An efficient algorithm on multi-class support vector machine model selection

    Publication Year: 2003, Page(s):3229 - 3232 vol.4
    Cited by:  Papers (2)

    Support vector machines (SVM) are very effective for general purpose pattern recognition. With carefully selected models, they have won many benchmark applications over conventional classification techniques. Current SVM model selection schemes are time consuming when they are applied to binary classification. It is practically impossible to apply these methods to multi-class SVM for detailed mode... (See the sketch after this entry.)

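    The snippet does not spell out the paper's efficient selection algorithm, so the sketch below only shows the baseline such methods aim to speed up: an exhaustive cross-validated grid search over (C, γ) for a multi-class RBF-kernel SVM using scikit-learn. The dataset and parameter grid are placeholders.

    ```python
    # Baseline multi-class SVM model selection: brute-force (C, gamma) grid search
    # with cross-validation. Dataset and grid are placeholders.
    from sklearn.datasets import load_iris
    from sklearn.model_selection import GridSearchCV
    from sklearn.svm import SVC

    X, y = load_iris(return_X_y=True)
    grid = {"C": [0.1, 1, 10, 100], "gamma": [0.001, 0.01, 0.1, 1]}
    search = GridSearchCV(SVC(kernel="rbf"), grid, cv=5)
    search.fit(X, y)                     # SVC handles multi-class via one-vs-one
    print(search.best_params_, round(search.best_score_, 3))
    ```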
  • Detecting rare events with lotto-type competitive learning

    Publication Year: 2003, Page(s):2506 - 2511 vol.4

    This paper highlights the difficulty of detecting small and rare clusters. In theory it is possible for individual neurons, in lotto-type competitive learning algorithms, to follow the source density function. We note that, in experiments, it is very difficult to locate these small clusters, especially if the prototype set is limited. Two methods are proposed, as exploratory tools, to locate these cl...

  • Learning communities: connectivity and dynamics of interacting agents

    Publication Year: 2003, Page(s):2797 - 2802 vol.4
    Cited by:  Papers (6)  |  Patents (1)

    Intelligent agents need to learn how the communication structure evolves within interacting groups and how to influence the group's overall behavior. We are developing methods to automatically and unobtrusively learn the social network structure that arises within a human group based on wearable sensors. Computational models of group interaction dynamics are derived from data gathered using wearabl...

  • Neural networks and rule extraction for prediction and explanation in the marketing domain

    Publication Year: 2003, Page(s):2866 - 2871 vol.4
    Cited by:  Papers (4)

    This paper contains a case study where neural networks are used for prediction and explanation in the marketing domain. Initially, neural networks are used for regression and classification to predict the impact of advertising from money invested in different media categories. Rule extraction is then performed on the trained networks, using the G-REX method, which is based on genetic programming. ...

  • Classifiability based omnivariate decision trees

    Publication Year: 2003, Page(s):3223 - 3228 vol.4
    Cited by:  Papers (1)

    Decision trees represent a simple and powerful method of induction from labeled examples. Univariate decision trees consider the value of a single attribute at each node, leading to splits that are parallel to the axes. In linear multivariate decision trees, all the attributes are used and the partition at each node is based on a linear discriminant (a hyperplane). Nonlinear multivariate decis... (See the sketch after this entry.)

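    To make the univariate/multivariate distinction concrete, the sketch below scores the best axis-parallel split against a split on a linear combination of attributes (a hyperplane taken here from Fisher's discriminant) using weighted Gini impurity on toy data. The data and the choice of discriminant direction are assumptions; the paper's classifiability criterion is not implemented.

    ```python
    # Univariate (axis-parallel) split vs. linear multivariate (hyperplane) split,
    # both scored by weighted Gini impurity on toy two-class data.
    import numpy as np

    rng = np.random.default_rng(1)
    X = np.vstack([rng.normal([0, 0], 1.0, (100, 2)), rng.normal([2, 2], 1.0, (100, 2))])
    y = np.repeat([0, 1], 100)

    def gini(labels):
        p = np.bincount(labels, minlength=2) / len(labels)
        return 1.0 - np.sum(p ** 2)

    def best_split_gini(scores, y):
        """Smallest weighted Gini over all thresholds of a 1-D projection."""
        best = 1.0
        for t in np.unique(scores)[:-1]:
            left, right = y[scores <= t], y[scores > t]
            best = min(best, (len(left) * gini(left) + len(right) * gini(right)) / len(y))
        return best

    # Univariate: try each single attribute.
    uni = min(best_split_gini(X[:, j], y) for j in range(X.shape[1]))

    # Linear multivariate: project on Fisher's direction w = S_w^{-1}(m1 - m0).
    m0, m1 = X[y == 0].mean(0), X[y == 1].mean(0)
    Sw = np.cov(X[y == 0].T) + np.cov(X[y == 1].T)
    multi = best_split_gini(X @ np.linalg.solve(Sw, m1 - m0), y)

    print("best univariate Gini:", round(uni, 3), " hyperplane Gini:", round(multi, 3))
    ```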
  • A neural network for the typicality effects

    Publication Year: 2003, Page(s):2502 - 2505 vol.4

    The typicality effect is an important finding on the internal structure of concept hierarchies in the field of cognitive psychology. It says that instances of any concept are mentally ordered with respect to their "typicality". Typicality affects the mental behaviour of humans. For example, when asked to list all the instances of a concept, subjects retrieve them in descending order of their typi...

  • Generating structure in sensory data through coordinated motor activity

    Publication Year: 2003

    Evidence from human and animal studies suggests that neural and cognitive development unfolds in the course of active exploration of the sensory environment. We argue that the statistical structure of sensory inputs depends critically on coordinated motor activity (Lungarella and Pfeifer, 2001). We develop a set of statistical measures to objectively characterize streams of sensory data from an i...

  • SVM learning from large training data set

    Publication Year: 2003, Page(s):2860 - 2865 vol.4
    Cited by:  Papers (5)

    Support vector machines have been gaining popularity in the pattern classification research community. In this paper, we investigate efficient and effective algorithms for training SVMs on large data collections. We decompose the SVM learning problem into two stages. In the first stage, we develop an algorithm that uses a sequence of small subsets of the training data to select the parameters γ ... (See the sketch after this entry.)

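    The two-stage idea in the abstract (choose γ and C on small subsets of the training data, then train the final SVM on the full set) might look roughly like the sketch below; the subset size, parameter grid, synthetic data, and use of scikit-learn are assumptions rather than the authors' exact procedure.

    ```python
    # Two-stage sketch: select (C, gamma) on a small random subset, then train the
    # final SVM on the full training set with the selected parameters.
    import numpy as np
    from sklearn.datasets import make_classification
    from sklearn.model_selection import GridSearchCV, train_test_split
    from sklearn.svm import SVC

    X, y = make_classification(n_samples=6000, n_features=20, random_state=0)
    X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=0)

    # Stage 1: cheap parameter selection on a small subset of the training data.
    idx = np.random.default_rng(0).choice(len(X_tr), size=800, replace=False)
    grid = {"C": [1, 10, 100], "gamma": [0.001, 0.01, 0.1]}
    search = GridSearchCV(SVC(kernel="rbf"), grid, cv=3).fit(X_tr[idx], y_tr[idx])

    # Stage 2: train on the full training set with the selected parameters.
    final = SVC(kernel="rbf", **search.best_params_).fit(X_tr, y_tr)
    print(search.best_params_, round(final.score(X_te, y_te), 3))
    ```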
  • Combining evolving neural network classifiers using bagging

    Publication Year: 2003, Page(s):3218 - 3222 vol.4

    The performance of a neural network classifier depends significantly on its architecture and generalization ability. It is usual to find a proper architecture by trial and error, which is time consuming and may not always find the optimal network. For this reason, we apply genetic algorithms to the automatic generation of neural networks. Many researchers have shown that combining multiple classifie... (See the sketch after this entry.)

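    The bagging step itself (train each network on a bootstrap resample and combine predictions by majority vote) is standard and is sketched below with plain scikit-learn MLPs; the evolutionary architecture search described in the abstract is not included, and the data are synthetic placeholders.

    ```python
    # Bagging of neural network classifiers: bootstrap resampling + majority vote.
    import numpy as np
    from sklearn.datasets import make_moons
    from sklearn.model_selection import train_test_split
    from sklearn.neural_network import MLPClassifier

    X, y = make_moons(n_samples=600, noise=0.25, random_state=0)
    X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

    rng = np.random.default_rng(0)
    ensemble = []
    for _ in range(9):                                       # 9 bagged networks
        idx = rng.integers(0, len(X_tr), size=len(X_tr))     # bootstrap resample
        net = MLPClassifier(hidden_layer_sizes=(16,), max_iter=1000, random_state=0)
        ensemble.append(net.fit(X_tr[idx], y_tr[idx]))

    votes = np.stack([net.predict(X_te) for net in ensemble])   # shape (9, n_test)
    majority = (votes.mean(axis=0) > 0.5).astype(int)           # two-class majority vote
    print("bagged accuracy:", round(float((majority == y_te).mean()), 3))
    ```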
  • Mel-frequency cepstrum coefficients extraction from infant cry for classification of normal and pathological cry with feed-forward neural networks

    Publication Year: 2003, Page(s):3140 - 3145 vol.4
    Cited by:  Papers (6)

    This work presents the development of an automatic infant cry recognition system, with the objective of classifying two types of cry: normal cries and pathological cries from deaf babies. In this study, we used acoustic features obtained with the mel-frequency cepstrum technique and, as a classifier, a feed-forward neural network trained with several learning methods, resulting in a better scal... (See the sketch after this entry.)

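    A minimal version of the described pipeline (MFCC features per clip, then a feed-forward network classifier) is sketched below. It assumes librosa for MFCC extraction and scikit-learn's MLPClassifier; the cry recordings and labels of the study are not available here, so the clips are synthetic placeholders.

    ```python
    # MFCC features -> feed-forward neural network classifier (toy sketch).
    import numpy as np
    import librosa
    from sklearn.neural_network import MLPClassifier

    SR = 16000

    def mfcc_vector(signal, sr=SR, n_mfcc=13):
        """Mean MFCC vector over a one-second clip."""
        m = librosa.feature.mfcc(y=signal, sr=sr, n_mfcc=n_mfcc)   # (n_mfcc, frames)
        return m.mean(axis=1)

    # Placeholder "cries": noisy tones at two different pitches stand in for the
    # normal/pathological classes of the study.
    rng = np.random.default_rng(0)
    def fake_clip(high_pitch):
        t = np.arange(SR) / SR
        tone = np.sin(2 * np.pi * (800 if high_pitch else 300) * t)
        return (tone + 0.3 * rng.normal(size=SR)).astype(np.float32)

    X = np.array([mfcc_vector(fake_clip(i % 2 == 0)) for i in range(40)])
    y = np.array([i % 2 for i in range(40)])          # placeholder class labels

    clf = MLPClassifier(hidden_layer_sizes=(20,), max_iter=2000, random_state=0).fit(X, y)
    print("training accuracy:", clf.score(X, y))
    ```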
  • Adaptive vs. accommodative neural networks for adaptive system identification: part II

    Publication Year: 2003, Page(s):2497 - 2501 vol.4

    Adaptive neural networks (i.e. NNs with long- and short-term memories) and accommodative neural networks, which are recurrent NNs with fixed weights, are perhaps the most effective paradigms for general and systematic adaptive series-parallel system identification. Adaptive NNs involve less online computation, have no poor local minima to fall into, and offer much more timely and better adaptation than neur...

  • Flexible self-organizing maps by information maximization

    Publication Year: 2003, Page(s):2734 - 2739 vol.4

    In this paper, we propose a new information theoretic method for self-organizing maps. The method aims to control competitive processes flexibly, that is, to produce different competitive unit activations according to information content obtained in learning. Competition is realized by maximizing mutual information between input patterns and competitive units. Competitive unit outputs are computed...

  • Analyzing dividend events with neural network rule extraction

    Publication Year: 2003, Page(s):2854 - 2859 vol.4

    Over the last two decades, artificial neural networks (ANN) have been applied to solve a variety of problems such as pattern classification and function approximation. In many applications, it is desirable to extract knowledge from trained neural networks for the users to gain a better understanding of the network's solution. In this paper, we apply REFANN (rule extraction from function approximat...

  • Statistical learning for detecting protein-DNA-binding sites

    Publication Year: 2003, Page(s):2940 - 2945 vol.4

    Detecting the sites on genomic DNA at which DNA binding proteins bind is a highly relevant task in bioinformatics. For example, the binding sites of transcription factors are key elements of regulatory networks and determine the location of genes on a genome. Usually, for a given DNA binding protein, only a few DNA-subsequences at which the protein binds are known experimentally. The task then is ...

  • MMI-based training for a probabilistic neural network

    Publication Year: 2003, Page(s):2661 - 2666 vol.4

    Probabilistic neural networks (PNNs) that incorporate the Bayesian decision rule and statistical models have been widely used for pattern classification. Efficient estimation of the PNN's weights, however, is still a major problem. In this paper, we propose a new training scheme based on a discriminative criterion, maximum mutual information (MMI), and apply this method to the log-linearized Gauss... (See the sketch after this entry.)

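    For context, the underlying PNN decision rule (a Parzen-window estimate of each class-conditional density combined with the Bayes rule) is sketched below in plain NumPy; the paper's MMI-based training and log-linearized Gaussian-mixture form are not reproduced.

    ```python
    # Baseline probabilistic neural network (PNN): Gaussian Parzen-window class
    # densities plus the Bayes decision rule. MMI training is not shown.
    import numpy as np

    def pnn_predict(X_train, y_train, X_test, sigma=0.5):
        """Assign each test point to the class with the largest kernel-density score."""
        classes = np.unique(y_train)
        preds = []
        for x in X_test:
            d2 = np.sum((X_train - x) ** 2, axis=1)            # squared distances
            k = np.exp(-d2 / (2.0 * sigma ** 2))               # Gaussian window
            scores = [k[y_train == c].mean() for c in classes] # per-class density estimate
            preds.append(classes[int(np.argmax(scores))])
        return np.array(preds)

    rng = np.random.default_rng(0)
    X = np.vstack([rng.normal([0, 0], 0.7, (50, 2)), rng.normal([2, 2], 0.7, (50, 2))])
    y = np.repeat([0, 1], 50)
    print(pnn_predict(X, y, np.array([[0.1, 0.2], [1.9, 2.1]])))
    ```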
  • The optimization of radial basis probabilistic neural networks based on genetic algorithms

    Publication Year: 2003, Page(s):3213 - 3217 vol.4

    In this paper, a genetic algorithm (GA) is introduced to optimize radial basis probabilistic neural networks (RBPNNs). The encoding method proposed in this paper involves not only the number and the locations of the selected hidden centers but also the shape parameter of the Gaussian kernel function. We use the telling-two-spirals-apart problem as an example to validate the genetic algorithm fo... (See the sketch after this entry.)

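    The encoding idea in the abstract (a chromosome that selects which hidden centres to keep and also carries the Gaussian shape parameter) might be organized as in the sketch below; the decoding, fitness function, and the tiny GA loop are illustrative assumptions, not the authors' scheme.

    ```python
    # GA chromosome sketch for an RBF-style network: a binary mask over candidate
    # hidden centres plus one real-valued Gaussian width. Toy data and operators.
    import numpy as np

    rng = np.random.default_rng(0)
    X = np.vstack([rng.normal([0, 0], 0.8, (60, 2)), rng.normal([2, 2], 0.8, (60, 2))])
    y = np.repeat([0.0, 1.0], 60)
    candidates = X[rng.choice(len(X), 30, replace=False)]       # candidate centres

    def fitness(chrom):
        mask, width = chrom[:-1].astype(bool), abs(chrom[-1]) + 1e-3
        if mask.sum() == 0:
            return 0.0
        C = candidates[mask]
        Phi = np.exp(-((X[:, None, :] - C[None]) ** 2).sum(-1) / (2 * width ** 2))
        w, *_ = np.linalg.lstsq(Phi, y, rcond=None)             # linear output layer
        acc = ((Phi @ w > 0.5) == y.astype(bool)).mean()
        return acc - 0.01 * mask.sum()                          # penalize large networks

    pop = [np.append(rng.integers(0, 2, 30).astype(float), rng.uniform(0.1, 2.0))
           for _ in range(20)]
    for _ in range(30):                        # tiny GA: select, crossover, mutate
        pop.sort(key=fitness, reverse=True)
        parents, children = pop[:10], []
        for _ in range(10):
            a, b = parents[rng.integers(10)], parents[rng.integers(10)]
            cut = rng.integers(1, 30)
            child = np.concatenate([a[:cut], b[cut:]])          # one-point crossover
            if rng.random() < 0.3:
                j = rng.integers(30)
                child[j] = 1 - child[j]                         # flip one mask bit
            children.append(child)
        pop = parents + children

    best = max(pop, key=fitness)
    print("centres kept:", int(best[:-1].sum()), " width:", round(abs(best[-1]) + 1e-3, 2))
    ```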
  • Incorporating invariants in Mahalanobis distance based classifiers: application to face recognition

    Publication Year: 2003, Page(s):3118 - 3123 vol.4
    Cited by:  Papers (6)

    We present a technique for combining prior knowledge about transformations that should be ignored with a covariance matrix estimated from training data to make an improved Mahalanobis distance classifier. Modern classification problems often involve objects represented by high-dimensional vectors or images (for example, sampled speech or human faces). The complex statistical structure of these rep... (See the sketch after this entry.)

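    As a point of reference, the baseline the paper improves on (a Mahalanobis-distance classifier with a covariance matrix estimated from training data) is sketched below; the incorporation of invariant transformations, which is the paper's contribution, is not reproduced, and the data are synthetic placeholders.

    ```python
    # Baseline Mahalanobis-distance classifier: per-class means, pooled covariance,
    # assign each sample to the class with the smallest Mahalanobis distance.
    import numpy as np

    def fit_mahalanobis(X, y, ridge=1e-3):
        classes = np.unique(y)
        means = {c: X[y == c].mean(axis=0) for c in classes}
        pooled = sum(np.cov(X[y == c].T) for c in classes) / len(classes)
        precision = np.linalg.inv(pooled + ridge * np.eye(X.shape[1]))   # regularized inverse
        return classes, means, precision

    def predict(X, classes, means, precision):
        preds = []
        for x in X:
            d = [(x - means[c]) @ precision @ (x - means[c]) for c in classes]
            preds.append(classes[int(np.argmin(d))])
        return np.array(preds)

    rng = np.random.default_rng(0)
    X = np.vstack([rng.normal([0, 0], [1.0, 0.3], (80, 2)),
                   rng.normal([2, 1], [1.0, 0.3], (80, 2))])
    y = np.repeat([0, 1], 80)
    classes, means, precision = fit_mahalanobis(X, y)
    print(predict(np.array([[0.2, 0.1], [2.1, 0.9]]), classes, means, precision))
    ```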