Comments on "Learning convergence in the cerebellar model articulation controller"

2 Author(s)
Brown, M. (Dept. of Aeronaut. & Astronaut., Southampton Univ., UK); Harris, C.J.

The commenters refer to the paper by Wong and Sideris (ibid., vol. 3, pp. 115-21, 1992), which claims that, given sufficient training data, the original Albus CMAC (or binary CMAC) is capable of learning an arbitrary multivariate lookup table, that the linear optimization process is strictly positive definite, and that the basis functions are linearly independent. In recent work by Brown et al. (1994), however, it has been proved that the multivariate binary CMAC is unable to learn certain multivariate lookup tables, and that the number of such orthogonal functions increases exponentially as the generalization parameter is increased. A simple 2D orthogonal function is presented as a counterexample to the original theory. It is also demonstrated that the basis functions are always linearly dependent, in both the univariate and the multivariate case; hence the linear optimization process is only positive semi-definite, and there is always an infinite number of possible optimal weight vectors.
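To make these claims concrete, below is a minimal numerical sketch, not code from the paper: it assumes the standard Albus overlay layout (rho overlays, each displaced by one cell along the main diagonal), and the function name `cmac_basis`, the grid sizes, and the choices of rho are illustrative. It builds the binary basis matrix for a small CMAC, shows that the columns are linearly dependent (so the Gram matrix is only positive semi-definite), and, in the 2D case, recovers a nonzero lookup table orthogonal to every basis function, i.e., one the network cannot learn.

```python
import numpy as np

def cmac_basis(shape, rho):
    """Binary CMAC basis matrix B: one row per grid cell, one column per
    basis function; B[i, j] = 1 iff cell i lies inside the rho-wide
    hypercube supporting basis function j. (Illustrative construction,
    assuming the diagonally displaced Albus overlay layout.)"""
    ndim = len(shape)
    cells = np.array(np.meshgrid(*[np.arange(n) for n in shape],
                                 indexing="ij")).reshape(ndim, -1).T
    columns = []
    for d in range(rho):  # one overlay per diagonal offset
        # hypercube lower corners congruent to d (mod rho) on every axis
        starts = [np.arange(d - rho, n, rho) for n in shape]
        corners = np.array(np.meshgrid(*starts,
                                       indexing="ij")).reshape(ndim, -1).T
        for corner in corners:
            inside = np.all((cells >= corner) & (cells < corner + rho), axis=1)
            if inside.any():  # drop hypercubes that miss the grid entirely
                columns.append(inside.astype(float))
    return np.column_stack(columns)

# Univariate case: every lookup table on the grid is learnable (B has full
# row rank), yet the columns are linearly dependent, so B^T B is only
# positive semi-definite and the optimal weight vector is non-unique.
B1 = cmac_basis((9,), rho=3)
print(B1.shape, np.linalg.matrix_rank(B1))      # rank 9 < 11 columns
print(np.linalg.eigvalsh(B1.T @ B1).min())      # ~0: semi-definite Gram

# 2D case: the column space no longer covers all lookup tables, so there
# are nonzero tables orthogonal to every basis function, hence unlearnable.
B2 = cmac_basis((3, 3), rho=2)
u, s, vt = np.linalg.svd(B2)
print(B2.shape[0] - np.linalg.matrix_rank(B2))  # dim of orthogonal space > 0
f = u[:, -1]                                    # satisfies B2.T @ f == 0
print(np.abs(B2.T @ f).max())                   # ~0 up to round-off
print(f.reshape(3, 3))                          # a 2D "orthogonal function"
```

Note that in this sketch the univariate basis still spans every lookup table on its grid despite the dependent columns, which matches the point above that the orthogonal-function failure is specific to the multivariate case.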

Published in:

IEEE Transactions on Neural Networks (Volume: 6, Issue: 4)