Global Boltzmann perceptron network for online learning of conditional distributions

Authors: Thathachar, M.; Arvind, M.T. (Dept. of Electr. Eng., Indian Inst. of Sci., Bangalore, India)

This paper proposes a backpropagation-based feedforward neural network for learning the probability distributions of outputs conditioned on inputs, using only incoming input-output samples. The backpropagation procedure is shown to locally minimize the Kullback-Leibler measure in an expected sense. The procedure is enhanced to keep the weights bounded and to explore the search space so as to reach a global minimum. Weak convergence theory is employed to show that the long-term behavior of the resulting algorithm can be approximated by that of a stochastic differential equation whose invariant distributions are concentrated around the global minima of the Kullback-Leibler measure within a region of interest. Simulation studies on problems involving samples drawn from a mixture of labeled densities, as well as on the well-known Iris data problem, demonstrate the speed and accuracy of the proposed procedure.
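The core idea of the first step, minimizing the Kullback-Leibler measure in an expected sense from incoming input-output samples, can be illustrated with a plain feedforward network trained online by backpropagation on the per-sample negative log-likelihood (whose expectation differs from the KL measure only by a constant). This is a minimal sketch, not the paper's Boltzmann perceptron or its globally convergent enhancement; the network shape, learning rate, and sampling setup are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(z):
    z = z - z.max()          # numerical stability
    e = np.exp(z)
    return e / e.sum()

class OnlineSoftmaxNet:
    """One-hidden-layer network estimating p(y | x), trained one sample at a time."""

    def __init__(self, n_in, n_hidden, n_out, lr=0.1):
        self.W1 = rng.normal(0.0, 0.5, (n_hidden, n_in))
        self.b1 = np.zeros(n_hidden)
        self.W2 = rng.normal(0.0, 0.5, (n_out, n_hidden))
        self.b2 = np.zeros(n_out)
        self.lr = lr

    def forward(self, x):
        h = np.tanh(self.W1 @ x + self.b1)
        p = softmax(self.W2 @ h + self.b2)   # estimated conditional distribution
        return h, p

    def update(self, x, y):
        # One online backpropagation step on -log p(y | x).
        # Averaged over the sample stream, this descends the KL measure
        # between the true p(y | x) and the network's estimate.
        h, p = self.forward(x)
        d2 = p.copy()
        d2[y] -= 1.0                          # gradient at softmax output
        d1 = (self.W2.T @ d2) * (1.0 - h**2)  # backprop through tanh
        self.W2 -= self.lr * np.outer(d2, h)
        self.b2 -= self.lr * d2
        self.W1 -= self.lr * np.outer(d1, x)
        self.b1 -= self.lr * d1
        return -np.log(p[y])
```

As a usage example in the spirit of the paper's simulations, one can stream samples from a mixture of two labeled densities (here, 1-D Gaussians with means -1.5 and +1.5) and check that the network's conditional estimates sharpen around the correct label. Note this plain procedure only finds a local minimum; the paper's contribution is the enhancement that keeps weights bounded and concentrates the long-run dynamics near global minima.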

Published in: IEEE Transactions on Neural Networks (Volume: 10, Issue: 5)