Maximum a posteriori decision and evaluation of class probabilities by Boltzmann perceptron classifiers

2 Author(s)
Yair, E.; IBM Sci. Center, Haifa, Israel; Gersho, A.

It is shown that neural network architectures may offer a valuable alternative to the Bayesian classifier. With neural networks, the a posteriori probabilities are computed with no a priori assumptions about the probability distribution functions (PDFs) that generate the data. Rather than assuming certain types of PDFs for the input data, the neural classifier uses a general type of input-output mapping which is then designed to optimally comply with a given set of examples called the training set. It is demonstrated that the a posteriori class probabilities can be efficiently computed by a deterministic feedforward network, called the Boltzmann perceptron classifier (BPC). Maximum a posteriori (MAP) classifiers are also constructed as a special case of the BPC. Structural relationships between the BPC and a conventional multilayer perceptron (MLP) are given, and it is demonstrated that rather intricate boundaries between classes can be formed even with a relatively modest number of network units. Simulation results show that the BPC is comparable in performance to a Bayesian classifier.
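
The central idea of the abstract, reading posterior class probabilities off a deterministic feedforward pass rather than from an assumed PDF model, can be illustrated with a small sketch. The following Python/NumPy snippet is a minimal illustration under stated assumptions, not the paper's exact BPC architecture: it assumes a single tanh hidden layer and a softmax output whose values sum to one and are interpreted as estimates of P(class | x). All layer sizes, parameter names, and the map_decision helper are hypothetical.

```python
import numpy as np

def softmax(z):
    # Numerically stable softmax: subtract the max before exponentiating.
    z = z - np.max(z)
    e = np.exp(z)
    return e / e.sum()

def class_posteriors(x, W_h, b_h, W_o, b_o):
    """Posterior class probabilities from a small feedforward network.

    A deterministic forward pass: one hidden layer followed by a softmax
    output, so the outputs sum to one and can be read as estimates of
    P(class | x). Layer sizes and parameter names are illustrative only.
    """
    h = np.tanh(W_h @ x + b_h)      # hidden-unit activations
    scores = W_o @ h + b_o          # one score per class
    return softmax(scores)          # a posteriori probability estimates

def map_decision(x, W_h, b_h, W_o, b_o):
    # MAP classification: choose the class with the largest posterior.
    return int(np.argmax(class_posteriors(x, W_h, b_h, W_o, b_o)))

# Example usage with illustrative random parameters:
rng = np.random.default_rng(0)
W_h, b_h = rng.normal(size=(8, 2)), np.zeros(8)   # 2-D input, 8 hidden units
W_o, b_o = rng.normal(size=(3, 8)), np.zeros(3)   # 3 classes
x = np.array([0.5, -1.2])
print(class_posteriors(x, W_h, b_h, W_o, b_o))    # probabilities summing to 1
print(map_decision(x, W_h, b_h, W_o, b_o))        # index of the most probable class
```

In practice the weights would be fit to the training set mentioned in the abstract; the random parameters above only demonstrate that the forward pass yields a valid probability vector and a MAP decision.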

Published in:

Proceedings of the IEEE (Volume: 78, Issue: 10)