Issue 4 • Date Dec 1990
Sufficient condition for convergence of a relaxation algorithm in actual single-layer neural networks
Page(s): 300 - 303
Application of the contraction mapping theorem to single-layer feedback neural networks of the gradient type is discussed. A sufficient condition for stability of a relaxation algorithm in actual continuous-time networks is derived and illustrated with an example. Results showing the stability of a numerical solution obtained with the relaxation algorithm are presented.
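The contraction-mapping idea behind this abstract can be sketched briefly. The following is an illustrative example, not the condition derived in the paper: a relaxation iteration v ← g(Wv + b) for a single-layer feedback network, guarded by a simple sufficient contraction check (maximum sigmoid slope times the spectral norm of W below one), which guarantees convergence to a unique fixed point by the Banach fixed-point theorem. All names and constants here are assumptions for illustration.

```python
import numpy as np

def sigmoid(x, gain=1.0):
    return 1.0 / (1.0 + np.exp(-gain * x))

def relax(W, b, v0, gain=1.0, tol=1e-10, max_iter=10000):
    # Sufficient contraction condition (illustrative, not the paper's):
    # the maximum slope of the sigmoid is gain/4, so the map
    # v -> sigmoid(W v + b) is a contraction if (gain/4) * ||W||_2 < 1.
    lam = gain / 4.0
    assert lam * np.linalg.norm(W, 2) < 1.0, "contraction condition violated"
    v = v0
    for _ in range(max_iter):
        v_new = sigmoid(W @ v + b, gain)
        if np.linalg.norm(v_new - v) < tol:
            return v_new
        v = v_new
    return v

# Small example: (gain/4) * ||W||_2 = 0.25 < 1, so the iteration converges.
W = np.array([[0.0, 1.0], [1.0, 0.0]])
b = np.array([0.1, -0.2])
v_star = relax(W, b, np.zeros(2))
# v_star satisfies the fixed-point equation v* = sigmoid(W v* + b)
```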
The September 1990 special issue on neural networks of the IEEE Engineering in Medicine and Biology Magazine is described. This issue contains papers dealing with various aspects and applications of neural networks in a wide spectrum of biomedical engineering problems.
The Stone-Weierstrass theorem and its terminology are reviewed, and neural network architectures based on this theorem are presented. Specifically, exponential functions, polynomials, partial fractions, and Boolean functions are used to create networks capable of approximating arbitrary bounded measurable functions. A modified logistic network satisfying the theorem is proposed as an alternative to commonly used networks based on logistic squashing functions.
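The exponential-function case can be illustrated with a minimal sketch (the basis weights and target below are assumptions, not from the paper): a linear combination of exp(wᵢx) units fit by least squares. Since the algebra generated by exponentials separates points and contains constants, Stone-Weierstrass guarantees such sums can approximate any continuous function on a closed interval.

```python
import numpy as np

x = np.linspace(-1.0, 1.0, 200)
target = np.abs(x)                          # bounded continuous target (assumed)

# Fixed "hidden layer" of exponential units with assumed weights.
ws = np.array([-2.0, -1.0, 0.0, 1.0, 2.0])
H = np.exp(np.outer(x, ws))                 # design matrix: H[i, j] = exp(ws[j] * x[i])

# Output layer fit by linear least squares.
coef, *_ = np.linalg.lstsq(H, target, rcond=None)
approx = H @ coef

max_err = np.max(np.abs(approx - target))   # approximation error of the sketch
```

Adding more exponential units with distinct weights drives the error toward zero, which is the density argument the abstract invokes.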
The basic operation of a digital neuron is reviewed, and the theory of timed Petri nets used for modeling, representation, and analysis of the neuron-type processor (NTP) is reviewed. The timed Petri net is utilized to produce a model for the digital NTP. The neuron-type processor performs input temporal and spatial summation, as well as thresholding. The timed Petri net of the NTP operates asynchronously and sequentially takes on a series of distinct internal states, so that each of these states can concurrently realize a distinct set of steering switching functions depending on the pattern of steering inputs applied to it at the time. This model is structured using several subnets, called essential module units (EMUs). Depending on the number of input dendrites required for the NTP, the EMUs are interconnected to produce the required timed Petri net. The timed Petri net representation facilitates a method of analysis of neural networks containing NTPs prior to hardware implementation.
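The NTP's basic operation named in the abstract (spatial summation over dendrites, temporal summation over time, thresholding) can be sketched as a simple discrete-time simulation. This is an illustrative assumption, not the timed-Petri-net model itself; the decay constant, weights, and reset behavior are all hypothetical.

```python
def ntp_step(state, inputs, weights, decay=0.5, threshold=1.0):
    # Spatial summation across input dendrites.
    spatial = sum(w * x for w, x in zip(weights, inputs))
    # Temporal summation: decayed previous state plus the new contribution.
    state = decay * state + spatial
    fired = state >= threshold      # thresholding
    if fired:
        state = 0.0                 # reset after firing (assumed behavior)
    return state, fired

state = 0.0
weights = [0.4, 0.3, 0.3]
spikes = []
for inputs in [[1, 0, 0], [1, 1, 0], [1, 1, 1], [0, 0, 0]]:
    state, fired = ntp_step(state, inputs, weights)
    spikes.append(fired)
# The unit fires only once enough spatial and temporal input has accumulated.
```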
The relationship between minimizing a mean squared error and finding the optimal Bayesian classifier is reviewed. This provides a theoretical interpretation for the process by which neural networks are used in classification. A number of confidence measures are proposed to evaluate the performance of the neural network classifier within a statistical framework.
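To make the notion of a confidence measure concrete, here is a minimal sketch of two common ones, assuming classifier outputs that approximate posterior probabilities. These particular measures (top score, and margin between the top two scores) are illustrative; they are not necessarily the measures proposed in the paper.

```python
import numpy as np

def confidence(scores):
    # scores: per-class outputs, interpreted as estimated posteriors.
    scores = np.asarray(scores, dtype=float)
    order = np.sort(scores)[::-1]
    top = order[0]                  # confidence of the chosen class
    margin = order[0] - order[1]    # gap to the runner-up class
    return top, margin

top, margin = confidence([0.7, 0.2, 0.1])
# A large top score with a large margin suggests a reliable decision;
# a small margin flags an ambiguous pattern.
```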
The M-input optimum likelihood-ratio receiver is generalized by considering the case of different signal amplitudes on the receiver primary input lines. Using the more general likelihood-ratio receiver as a reference, an equivalent optimum multilayer perceptron neural network (or neural receiver) is identified for detecting the presence of an M-dimensional target signal corrupted by bandlimited white Gaussian noise. Analytical results are supported by Monte Carlo simulation runs, which indicate that the detection capability of the proposed neural receiver is not sensitive to the level of training or the number of patterns in the training set.
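The classical reference receiver can be sketched with a short Monte Carlo run. Assuming equal priors and a known M-dimensional signal s in white Gaussian noise, the likelihood-ratio test reduces to a correlator that declares the signal present when x·s exceeds ‖s‖²/2; the per-line amplitudes below differ, echoing the generalization the abstract describes, but their values are illustrative.

```python
import numpy as np

rng = np.random.default_rng(1)
M = 8
s = np.linspace(0.5, 1.5, M)        # different (assumed) amplitudes per input line
sigma = 1.0                         # noise standard deviation
trials = 20000

# Half the trials contain the signal, half are noise only.
present = rng.integers(0, 2, trials).astype(bool)
noise = rng.normal(0.0, sigma, (trials, M))
x = noise + np.where(present[:, None], s, 0.0)

# Likelihood-ratio (correlator) decision rule for equal priors.
decide = x @ s > 0.5 * (s @ s)
accuracy = np.mean(decide == present)
```

A multilayer perceptron trained on the same detection task would be benchmarked against this receiver's accuracy, which is how the abstract uses the likelihood-ratio receiver as a reference.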
The multilayer perceptron, when trained as a classifier using backpropagation, is shown to approximate the Bayes optimal discriminant function. The result is demonstrated for both the two-class problem and multiple classes. It is shown that the outputs of the multilayer perceptron approximate the a posteriori probability functions of the classes being trained. The proof applies to any number of layers and any type of unit activation function, linear or nonlinear.
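The claim that MSE-trained outputs approximate posterior probabilities can be checked empirically in a toy setting. In this hedged sketch (the data model, learning rate, and single-sigmoid-unit "network" are all illustrative assumptions), 1-D samples come from two unit-variance Gaussian classes at ±1 with equal priors, so the true posterior is exactly sigmoid(2x); a unit trained by gradient descent on squared error against 0/1 labels converges toward that posterior.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 20000
y = rng.integers(0, 2, n)                         # equal-prior class labels
x = rng.normal(np.where(y == 1, 1.0, -1.0), 1.0)  # class-conditional Gaussians

w, b = 0.0, 0.0                                   # single sigmoid output unit
lr = 0.5
for _ in range(2000):
    out = 1.0 / (1.0 + np.exp(-(w * x + b)))
    err = out - y                                 # MSE residual against 0/1 targets
    grad = err * out * (1.0 - out)                # backprop through the sigmoid
    w -= lr * np.mean(grad * x)
    b -= lr * np.mean(grad)

# Analytic posterior for this data model: P(y=1 | x) = sigmoid(2x).
xs = np.linspace(-3, 3, 61)
net = 1.0 / (1.0 + np.exp(-(w * xs + b)))
true = 1.0 / (1.0 + np.exp(-2.0 * xs))
max_gap = np.max(np.abs(net - true))              # trained output vs. true posterior
```

The small gap illustrates the abstract's result: minimizing squared error against class indicators drives the network output toward E[y | x], which is the a posteriori probability.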
Aims & Scope
IEEE Transactions on Neural Networks is devoted to the science and technology of neural networks, publishing significant technical knowledge, exploratory developments, and applications of neural networks, from biology to software to hardware.
This Transactions ceased production in 2011. The current retitled publication is IEEE Transactions on Neural Networks and Learning Systems.