Regularization for the kernel recursive least squares CMAC

Authors (2):

C. Laufer (Electrical and Electronic Engineering Department, University of Auckland, New Zealand); G. Coghill

Abstract:

The Cerebellar Model Articulation Controller (CMAC) neural network is an associative memory that is biologically inspired by the cerebellum, which is found in the brains of animals. In recent works, the kernel recursive least squares CMAC (KRLS-CMAC) was proposed as a superior alternative to the standard CMAC, as it converges faster, does not require tuning of a learning rate parameter, and achieves much better modeling accuracy. The KRLS-CMAC, however, still suffered from the learning interference problem. Learning interference was addressed in the standard CMAC by regularization. Previous works have also applied regularization to kernelized CMACs; however, those approaches were not computationally feasible for large resolutions and dimensionalities. This paper brings the regularization technique to the KRLS-CMAC in a way that allows it to be used efficiently in multiple dimensions with infinite resolution kernel functions.
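To make the core idea concrete, the following is a minimal sketch of a *ridge-regularized kernel recursive least squares* learner in the general (non-CMAC) setting. It is not the paper's KRLS-CMAC algorithm: the Gaussian RBF kernel, the class name, and the parameter `lam` are illustrative assumptions, and the paper's method would use a CMAC-specific kernel and a more efficient multi-dimensional scheme. The sketch maintains the inverse of the regularized kernel matrix (K + λI) incrementally via block matrix inversion, so each new sample costs O(n²) instead of an O(n³) re-solve:

```python
import numpy as np

def rbf(a, b, width=0.5):
    """Gaussian RBF kernel (an assumed stand-in for a CMAC kernel)."""
    return np.exp(-((a - b) ** 2) / (2 * width ** 2))

class RegularizedKernelRLS:
    """Kernel recursive least squares with ridge regularization.

    Solves alpha = (K + lam*I)^{-1} y in the dual, updating the
    inverse one sample at a time by block matrix inversion.
    """
    def __init__(self, kernel=rbf, lam=1e-3):
        self.kernel = kernel
        self.lam = lam      # regularization strength (hypothetical value)
        self.X = []         # stored training inputs
        self.y = []         # stored targets
        self.Ainv = None    # running inverse of (K + lam*I)
        self.alpha = None   # dual weights

    def update(self, x, target):
        if self.Ainv is None:
            # First sample: 1x1 system.
            self.Ainv = np.array([[1.0 / (self.kernel(x, x) + self.lam)]])
        else:
            # Block-inverse update with Schur complement gamma.
            k = np.array([self.kernel(x, xi) for xi in self.X])
            b = self.Ainv @ k
            gamma = self.kernel(x, x) + self.lam - k @ b
            n = len(self.X)
            Anew = np.empty((n + 1, n + 1))
            Anew[:n, :n] = self.Ainv + np.outer(b, b) / gamma
            Anew[:n, n] = -b / gamma
            Anew[n, :n] = -b / gamma
            Anew[n, n] = 1.0 / gamma
            self.Ainv = Anew
        self.X.append(x)
        self.y.append(target)
        self.alpha = self.Ainv @ np.array(self.y)

    def predict(self, x):
        k = np.array([self.kernel(x, xi) for xi in self.X])
        return float(k @ self.alpha)
```

The λI term is what plays the role of regularization here: without it, strongly overlapping kernel activations make (K)⁻¹ ill-conditioned, which is the dual-space analogue of the learning interference the abstract describes.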

Published in:

The 2012 International Joint Conference on Neural Networks (IJCNN)

Date of Conference:

10-15 June 2012