
Neural networks: binary, monotonic and multiple-valued

Author:

J.M. Zurada, Dept. of Electrical and Computer Engineering, University of Louisville, KY, USA

This paper demonstrates how conventional neural networks can be modified, extended, or generalized by introducing basic notions of multiple-valued logic into the definition of neurons. Multilevel neurons are shown to produce useful attractor-type neural networks and lead to multistable memory cells, which opens up the possibility of storing a multiplicity of logic levels in a "generalized" Hopfield memory. Another interesting attractor-type network encodes information in the complex output values of the neurons, specifically in their phase angles; working as a memory, this network is able to recognize many stored grey-level values at the output of a single neuron, and as such represents an extension of bivalent information processors. Multilevel neurons can also be employed in perceptron-type classifiers trained with the error backpropagation algorithm, with the advantage that the resulting networks are smaller, requiring fewer weights and neurons to perform typical classification tasks. This improvement comes at the cost of considerably more complex neuron activation functions.
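To make the two mechanisms in the abstract concrete, here is a minimal NumPy sketch, not the paper's exact formulation: a soft "staircase" activation that quantizes a neuron's net input into k logic levels (differentiable, so usable with backpropagation), and a complex-valued neuron that encodes one of k grey levels in the phase angle of a unit-magnitude output. The function names, level counts, and gain constant are illustrative assumptions, not definitions from the paper.

```python
import numpy as np

def multilevel_activation(net, k=4, gain=5.0):
    """Soft staircase with k output levels in [0, 1].

    Superimposing k-1 shifted sigmoids approximates a k-step quantizer;
    as `gain` grows, the steps sharpen toward hard thresholds.
    """
    steps = np.arange(1, k)                      # step positions 1, 2, ..., k-1
    sigmoids = 1.0 / (1.0 + np.exp(-gain * (net - steps[:, None])))
    return sigmoids.sum(axis=0) / (k - 1)

def phase_neuron(net, k=8):
    """Complex-valued neuron: snap the phase of the complex net input to
    the nearest of k phase sectors and emit a unit complex number at that
    phase.  The stored grey level is read off the output's angle."""
    angle = np.angle(net)                        # phase of the complex net input
    sector = np.round(angle / (2 * np.pi / k)) % k
    return np.exp(1j * 2 * np.pi * sector / k)   # unit magnitude, quantized phase

if __name__ == "__main__":
    net = np.linspace(-1.0, 5.0, 7)
    print("multilevel outputs:", np.round(multilevel_activation(net, k=4), 2))

    z = 0.8 * np.exp(1j * 1.9)                   # an example complex net input
    out = phase_neuron(np.array([z]), k=8)
    print("phase sector (grey level):",
          int(np.round(np.angle(out[0]) / (2 * np.pi / 8)) % 8))
```

Keeping the staircase smooth preserves differentiability for gradient-based training, as the abstract's backpropagation-trained classifiers require; pushing the gain toward infinity recovers a hard multilevel threshold unit suitable for attractor-type recall.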

Published in:

Proceedings of the 30th IEEE International Symposium on Multiple-Valued Logic (ISMVL 2000)

Date of Conference:

2000