The Cerebellar Model Articulation Controller (CMAC) neural network is an associative memory inspired by the cerebellum found in the brains of animals. In recent work, the kernel recursive least squares CMAC (KRLS-CMAC) was proposed as a superior alternative to the standard CMAC: it converges faster, requires no tuning of a learning-rate parameter, and achieves higher modeling accuracy. The KRLS-CMAC, however, still suffers from the learning-interference problem. In the standard CMAC, learning interference was addressed through regularization. Previous works have also applied regularization to kernelized CMACs, but those approaches were computationally infeasible for large resolutions and dimensionalities. This paper brings the regularization technique to the KRLS-CMAC in a way that allows it to be used efficiently in multiple dimensions with infinite-resolution kernel functions.
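To give a concrete sense of how regularization enters a kernelized least-squares model, the following is a minimal illustrative sketch of kernel ridge regression on a toy 1-D problem. This is not the paper's KRLS-CMAC algorithm; the kernel choice, the `width` and `lam` parameters, and the batch (non-recursive) solve are all assumptions made for illustration. The key point is the ridge term `lam * I` added to the kernel Gram matrix, which plays the role of the regularizer.

```python
import numpy as np

def gaussian_kernel(X, Y, width=0.5):
    # Pairwise Gaussian (RBF) kernel matrix between the rows of X and Y.
    # Illustrative choice only; the paper's kernel may differ.
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2.0 * width ** 2))

def fit_kernel_ridge(X, y, lam=1e-2, width=0.5):
    # Solve (K + lam * I) alpha = y.
    # lam is the regularization strength; lam = 0 recovers
    # unregularized kernel least squares.
    K = gaussian_kernel(X, X, width)
    return np.linalg.solve(K + lam * np.eye(len(X)), y)

def predict(X_train, alpha, X_new, width=0.5):
    # Prediction is a kernel-weighted sum over the training points.
    return gaussian_kernel(X_new, X_train, width) @ alpha

# Toy 1-D regression problem: noisy samples of a sine wave.
rng = np.random.default_rng(0)
X = rng.uniform(-1.0, 1.0, size=(50, 1))
y = np.sin(3.0 * X[:, 0]) + 0.05 * rng.standard_normal(50)

alpha = fit_kernel_ridge(X, y)
y_hat = predict(X, alpha, X)
```

A batch solve like this costs O(n^3) in the number of samples, which is exactly the kind of cost a recursive formulation such as KRLS avoids by updating the solution one sample at a time.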