IEEE Transactions on Neural Networks

Issue 4 • December 1990

  • Sufficient condition for convergence of a relaxation algorithm in actual single-layer neural networks

    Page(s): 300 - 303

    Application of the contraction mapping theorem to single-layer feedback neural networks of the gradient type is discussed. A sufficient condition for stability of a relaxation algorithm in actual continuous-time networks is derived and illustrated with an example. Results showing the stability of a numerical solution obtained with the relaxation algorithm are presented.

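The abstract does not quote the condition itself, but the standard route is Banach's contraction mapping theorem: a relaxation update converges to a unique fixed point whenever the update map is a contraction. A minimal sketch, assuming a hypothetical gradient-type update v <- g(Wv + b) with g = tanh, for which ||W||_2 * sup|g'| < 1 is one such sufficient condition (all names and sizes here are illustrative, not the paper's):

```python
import numpy as np

def relax(W, b, v0, g=np.tanh, tol=1e-10, max_iter=1000):
    """Fixed-point relaxation v <- g(W v + b).

    Banach's theorem guarantees convergence to a unique fixed point
    when the map is a contraction; for this update a sufficient
    condition is ||W||_2 * sup|g'| < 1 (and sup|tanh'| = 1).
    """
    if np.linalg.norm(W, 2) >= 1.0:  # Lipschitz bound for g = tanh
        print("warning: bound >= 1, convergence not guaranteed")
    v = v0
    for k in range(max_iter):
        v_next = g(W @ v + b)
        if np.linalg.norm(v_next - v) < tol:
            return v_next, k + 1
        v = v_next
    return v, max_iter

rng = np.random.default_rng(0)
W = 0.2 * rng.standard_normal((4, 4))   # small weights => contraction
v_star, iters = relax(W, rng.standard_normal(4), np.zeros(4))
print(f"converged in {iters} iterations")
```

Shrinking the weight matrix (or the activation gain) is the usual way to enforce such a bound in practice.
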
  • Neural networks for engineering in medicine and biology

    Page(s): 305 - 306

    The September 1990 special issue of the IEEE Engineering in Medicine and Biology Magazine on neural networks is described. The issue contains papers dealing with various aspects and applications of neural networks across a wide spectrum of biomedical engineering problems.

  • The Stone-Weierstrass theorem and its application to neural networks

    Page(s): 290 - 295

    The Stone-Weierstrass theorem and its terminology are reviewed, and neural network architectures based on this theorem are presented. Specifically, exponential functions, polynomials, partial fractions, and Boolean functions are used to create networks capable of approximating arbitrary bounded measurable functions. A modified logistic network satisfying the theorem is proposed as an alternative to commonly used networks based on logistic squashing functions.

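As a concrete illustration of the density argument (not the paper's construction): polynomials form an algebra that separates points and contains constants, so by Stone-Weierstrass a one-hidden-layer "network" of monomial units with a linear output can approximate any continuous function on a closed interval. A sketch, with illustrative names:

```python
import numpy as np

# Stone-Weierstrass: polynomials are dense in C([a, b]), so a network
# whose hidden units are monomials x^k with linear output weights can
# approximate any continuous target uniformly well as degree grows.
def fit_poly_net(x, y, degree):
    Phi = np.vander(x, degree + 1)               # hidden layer: monomial units
    w, *_ = np.linalg.lstsq(Phi, y, rcond=None)  # linear output weights
    return lambda t: np.vander(t, degree + 1) @ w

x = np.linspace(-1, 1, 200)
target = np.sin(3 * x) + 0.3 * np.abs(x)         # continuous, bounded target
for deg in (3, 7, 15):
    f = fit_poly_net(x, target, deg)
    print(f"degree {deg:2d}: max error {np.max(np.abs(f(x) - target)):.4f}")
```

The maximum error shrinks as the degree grows, mirroring the uniform-approximation guarantee the theorem provides.
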
  • Neuron type processor modeling using a timed Petri net

    Page(s): 282 - 289

    The basic operation of a digital neuron is reviewed, and the theory of timed Petri nets, used for modeling, representation, and analysis of the neuron-type processor (NTP), is reviewed. The timed Petri net is used to produce a model for the digital NTP, which performs temporal and spatial summation of its inputs as well as thresholding. The timed Petri net of the NTP operates asynchronously and sequentially takes on a series of distinct internal states; each state can concurrently realize a distinct set of steering switching functions depending on the pattern of steering inputs applied at the time. The model is structured from several subnets, called essential module units (EMUs), which are interconnected to produce the required timed Petri net for the desired number of input dendrites. The timed Petri net representation facilitates analysis of neural networks containing NTPs prior to hardware implementation.

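The EMU structure is not spelled out in the abstract, but the basic mechanics of a timed Petri net are easy to sketch: a transition fires when every input place holds a token, consumes one token per input, and deposits output tokens after a fixed delay. A toy example, with hypothetical place and transition names, of two dendrite places feeding a "soma" transition:

```python
import heapq

# Minimal timed Petri net: transitions fire when all input places hold
# tokens, consuming one token per input and depositing one token in
# each output place after a fixed delay.  This does not reproduce the
# paper's EMU subnets; it only illustrates the firing mechanics.
def simulate(places, transitions, horizon=10.0):
    t, pending = 0.0, []            # pending: (completion_time, outputs)
    while t <= horizon:
        for name, (ins, outs, delay) in transitions.items():
            if all(places[p] > 0 for p in ins):   # transition enabled
                for p in ins:
                    places[p] -= 1                # consume input tokens
                heapq.heappush(pending, (t + delay, outs))
                print(f"t={t:.1f}: {name} fires")
        if not pending:
            break
        t, outs = heapq.heappop(pending)          # advance to next completion
        for p in outs:
            places[p] += 1                        # deposit output tokens

places = {"dendrite_a": 1, "dendrite_b": 1, "axon": 0}
transitions = {"soma": (("dendrite_a", "dendrite_b"), ("axon",), 2.0)}
simulate(places, transitions)
print("final marking:", places)
```

The "soma" transition only fires once both dendrite places are marked, a token-level analogue of the spatial summation and thresholding described above.
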
  • Neural network classification: a Bayesian interpretation

    Page(s): 303 - 305

    The relationship between minimizing a mean squared error and finding the optimal Bayesian classifier is reviewed. This provides a theoretical interpretation for the process by which neural networks are used in classification. A number of confidence measures are proposed to evaluate the performance of the neural network classifier within a statistical framework.

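The relationship being reviewed is the classical one: the minimum mean-squared-error predictor of a 0/1 class indicator is the conditional expectation E[y|x] = P(y = 1|x), so a squared-error-trained classifier estimates posterior probabilities. A minimal numerical check with a synthetic discrete feature (all values here are illustrative):

```python
import numpy as np

# The MSE-optimal predictor of a 0/1 label y given x is
# E[y | x] = P(y = 1 | x), so minimizing squared error on indicator
# targets recovers the posterior.  Verify with a 3-valued feature.
rng = np.random.default_rng(1)
n = 200_000
x = rng.integers(0, 3, size=n)              # feature taking values 0, 1, 2
p_true = np.array([0.1, 0.5, 0.9])          # P(y = 1 | x) by construction
y = (rng.random(n) < p_true[x]).astype(float)

# Unconstrained MSE minimizer: the per-x mean of the 0/1 targets.
mse_pred = np.array([y[x == v].mean() for v in range(3)])
print("true posterior:", p_true)
print("MSE minimizer :", mse_pred.round(3))  # matches up to sampling noise
```
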
  • An optimum multilayer perceptron neural receiver for signal detection

    Page(s): 298 - 300

    The M-input optimum likelihood-ratio receiver is generalized by considering the case of different signal amplitudes on the receiver primary input lines. Using the more general likelihood-ratio receiver as a reference, an equivalent optimum multilayer perceptron neural network (or neural receiver) is identified for detecting the presence of an M-dimensional target signal corrupted by bandlimited white Gaussian noise. Analytical results are supported by Monte Carlo simulation runs, which indicate that the detection capability of the proposed neural receiver is not sensitive to the level of training or the number of patterns in the training set.

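The paper's generalization to unequal amplitudes on the input lines is not reproduced here, but the reference receiver it starts from is the classical one: for a known M-dimensional signal s in white Gaussian noise, the likelihood-ratio test reduces to comparing the correlator output x·s against a threshold. A Monte Carlo sketch under those textbook assumptions:

```python
import numpy as np

# Classical reference receiver: for a known signal s in white Gaussian
# noise, the likelihood-ratio test reduces to the correlator x @ s
# against a threshold (the matched filter).  Estimate detection and
# false-alarm rates by Monte Carlo for one midpoint threshold.
rng = np.random.default_rng(2)
M, sigma, trials = 8, 1.0, 100_000
s = rng.standard_normal(M)
s *= 2.0 / np.linalg.norm(s)                   # fix signal norm ||s|| = 2

noise = sigma * rng.standard_normal((trials, M))
stat_h0 = noise @ s                            # H0: signal absent
stat_h1 = (noise + s) @ s                      # H1: signal present

thresh = 0.5 * s @ s                           # midpoint of the two means
print(f"false alarm: {np.mean(stat_h0 > thresh):.3f}")   # ~0.159 here
print(f"detection  : {np.mean(stat_h1 > thresh):.3f}")   # ~0.841 here
```
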
  • The multilayer perceptron as an approximation to a Bayes optimal discriminant function

    Page(s): 296 - 298

    The multilayer perceptron, when trained as a classifier using backpropagation, is shown to approximate the Bayes optimal discriminant function. The result is demonstrated for both the two-class problem and multiple classes. It is shown that the outputs of the multilayer perceptron approximate the a posteriori probability functions of the classes being trained. The proof applies to any number of layers and any type of unit activation function, linear or nonlinear.

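The claim is easy to observe numerically. A minimal sketch, not the paper's proof: for two equally likely 1-D Gaussian classes N(-1, 1) and N(+1, 1), the true posterior is P(y = 1|x) = sigmoid(2x), and a small backpropagation-trained MLP minimizing squared error should approach it (network size, learning rate, and epoch count are illustrative):

```python
import numpy as np

# Two 1-D Gaussian classes with means -1 and +1 and equal priors; the
# analytic posterior is P(y=1|x) = sigmoid(2x).  Train a 1-H-1 MLP by
# plain batch backpropagation on squared error and compare.
rng = np.random.default_rng(3)
n = 4000
y = rng.integers(0, 2, n).astype(float)
x = rng.standard_normal(n) + (2 * y - 1)       # class-conditional means
X = x[:, None]

sig = lambda z: 1.0 / (1.0 + np.exp(-z))
H, lr = 16, 0.5                                # illustrative hyperparameters
W1 = 0.5 * rng.standard_normal((1, H)); b1 = np.zeros(H)
W2 = 0.5 * rng.standard_normal((H, 1)); b2 = np.zeros(1)

for epoch in range(5000):
    h = np.tanh(X @ W1 + b1)                   # hidden layer
    o = sig(h @ W2 + b2)[:, 0]                 # network output
    d_out = 2 * (o - y) * o * (1 - o) / n      # dMSE/dz at output
    d_hid = (d_out[:, None] @ W2.T) * (1 - h**2)
    W2 -= lr * (h.T @ d_out[:, None]); b2 -= lr * d_out.sum(keepdims=True)
    W1 -= lr * (X.T @ d_hid);          b1 -= lr * d_hid.sum(axis=0)

for t in (-2.0, -1.0, 0.0, 1.0, 2.0):
    net = sig(np.tanh(np.array([[t]]) @ W1 + b1) @ W2 + b2)[0, 0]
    print(f"x={t:+.1f}  net={net:.3f}  true={sig(2 * t):.3f}")
```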

Aims & Scope

IEEE Transactions on Neural Networks is devoted to the science and technology of neural networks, publishing significant technical knowledge, exploratory developments, and applications of neural networks ranging from biology to software to hardware.

This Transactions ceased production in 2011. The current retitled publication is IEEE Transactions on Neural Networks and Learning Systems.
