Integration of CMAC technique and weighted regression for efficient learning and output differentiability

Authors: Chun-shin Lin (Department of Electrical Engineering, University of Missouri, Columbia, MO, USA); Ching-Tsan Chiang

The cerebellar model articulation controller (CMAC) has attractive learning convergence and speed properties, and many studies have applied it to learning control with successful results. However, because CMAC is a table-lookup technique, a model implemented with a CMAC does not provide a derivative of its output, which is an inconvenience when CMAC is used in learning structures that require such derivatives. This paper presents a new scheme that integrates the CMAC addressing technique with weighted regression to resolve this problem. Derivatives exist everywhere except on the boundaries of quantized regions. Compared with the conventional CMAC, the new scheme requires the same amount of memory and has a similar learning speed, but provides output differentiability and more precise output. Compared with the typical weighted regression technique, the new scheme offers an efficient way to organize and utilize collected information.
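The abstract describes the scheme only at a high level. As a rough, hypothetical sketch of the general idea (not the authors' formulation), the Python below combines CMAC-style tiling addresses with small local linear models so that the table-lookup output has a derivative inside each quantized cell; the class name, the 1-D setting, the LMS-style update rule, and all parameter values are assumptions made purely for illustration.

```python
import numpy as np

# Illustrative sketch only: a 1-D CMAC whose cells hold local linear models
# (intercept + slope) instead of constant weights, so the lookup output is
# differentiable inside each quantized region.
class DifferentiableCMAC1D:
    def __init__(self, x_min, x_max, n_tilings=8, cells_per_tiling=16, lr=0.1):
        self.x_min = x_min
        self.n_tilings = n_tilings
        self.lr = lr
        self.width = (x_max - x_min) / cells_per_tiling
        n_cells = cells_per_tiling + 1            # extra cell to cover tiling offsets
        self.w0 = np.zeros((n_tilings, n_cells))  # local intercepts
        self.w1 = np.zeros((n_tilings, n_cells))  # local slopes

    def _active_cells(self, x):
        # CMAC addressing: one active cell per tiling, tilings offset from each other
        for t in range(self.n_tilings):
            offset = t * self.width / self.n_tilings
            i = int((x - self.x_min + offset) / self.width)
            i = min(max(i, 0), self.w0.shape[1] - 1)
            center = self.x_min + (i + 0.5) * self.width - offset
            yield t, i, x - center                # local coordinate inside the cell

    def predict(self, x):
        # output: average of the active cells' local linear models
        return sum(self.w0[t, i] + self.w1[t, i] * dx
                   for t, i, dx in self._active_cells(x)) / self.n_tilings

    def derivative(self, x):
        # dy/dx exists inside cells: average of the active cells' local slopes
        return sum(self.w1[t, i] for t, i, _ in self._active_cells(x)) / self.n_tilings

    def update(self, x, target):
        # distribute the output error over the active cells' local models (LMS step)
        err = target - self.predict(x)
        for t, i, dx in self._active_cells(x):
            self.w0[t, i] += self.lr * err / self.n_tilings
            self.w1[t, i] += self.lr * err * dx / self.n_tilings


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    model = DifferentiableCMAC1D(0.0, 2.0 * np.pi)
    for _ in range(50000):
        x = rng.uniform(0.0, 2.0 * np.pi)
        model.update(x, np.sin(x))
    # prediction tracks sin(x); the slope estimate roughly tracks cos(x)
    print(model.predict(1.0), model.derivative(1.0))
```

In this sketch the output derivative is simply the average of the active cells' local slopes, so it exists everywhere except on cell boundaries, mirroring the differentiability property described in the abstract; the paper's actual weighted-regression formulation may differ.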

Published in: IEEE Transactions on Systems, Man, and Cybernetics, Part B: Cybernetics (Volume 28, Issue 2)