Bayesian alternatives to neural computing

Author: J. C. Westland, School of Business Administration, University of Southern California, Los Angeles, CA, USA

This paper investigates two types of neural organization: Hebbian and perceptron learning. Hebbian neural learning merely serves to summarize input, whereas perceptron learning adjusts to meet system objectives. Bayesian models have also been proposed as archetypes for human learning, providing decisions in an uncertain environment. Bayesian analogs to Hebbian and perceptron learning were constructed and found to respond more smoothly and predictably than neural models. They tend to discount information that is already known and provide smoother transitions from one revision to the next, even at relatively high learning rates. When a Bayesian system receives no evidence about a given parameter, activation levels decline and the system "forgets". The Bayesian analogs of Hebbian and perceptron learning move roughly in tandem with neural networks and yield similar decisions, but the Bayesian models offer statistical performance metrics useful in the design and development of systems. The Bayesian analogs retain the features that attract engineers to neural networks, while dispelling uneasiness associated with the "black box" character of neural systems.
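The contrast the abstract draws, between Hebbian learning that simply accumulates input and Bayesian updating that discounts evidence already absorbed, can be illustrated with a minimal sketch. This is not the paper's actual formulation; it assumes a textbook Hebb rule and a conjugate beta-Bernoulli posterior as a stand-in Bayesian analog:

```python
# Illustrative sketch (not the paper's model): a Hebbian weight grows by a
# fixed amount on each repeated identical input, while a beta-Bernoulli
# posterior mean shifts less with each new identical observation --
# the "smoother transitions" the abstract attributes to Bayesian analogs.

def hebbian_update(w, x, y, rate=0.5):
    """Hebb's rule: weight change proportional to input times output."""
    return w + rate * x * y

def bayes_update(alpha, beta, success):
    """Conjugate beta-Bernoulli update of the posterior parameters."""
    return (alpha + 1, beta) if success else (alpha, beta + 1)

def bayes_mean(alpha, beta):
    """Posterior mean of the Bernoulli parameter."""
    return alpha / (alpha + beta)

w = 0.0
alpha, beta = 1.0, 1.0  # uniform prior
hebb_steps, bayes_steps = [], []
prev_mean = bayes_mean(alpha, beta)
for _ in range(5):  # feed the same evidence five times
    w_new = hebbian_update(w, x=1.0, y=1.0)
    hebb_steps.append(w_new - w)          # constant-size revisions
    w = w_new
    alpha, beta = bayes_update(alpha, beta, success=True)
    mean = bayes_mean(alpha, beta)
    bayes_steps.append(mean - prev_mean)  # each revision smaller than the last
    prev_mean = mean

print(hebb_steps)   # identical step sizes: Hebbian learning just accumulates
print(bayes_steps)  # shrinking step sizes: known information is discounted
```

Under these assumptions, the Hebbian revisions stay constant while the Bayesian revisions shrink monotonically, mirroring the smoother, more predictable response the abstract reports.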

Published in:

IEEE Transactions on Systems, Man, and Cybernetics (Volume: 25, Issue: 1)