The analysis of the learning convergence of CMAC in the frequency domain is first extended to a more general case, in which the training samples are evenly distributed over the quantization range and the learning rate is not restricted to one. The convergence condition is presented, and the influence of the learning rate β on the convergence range is analyzed. If 0 < β < 1, CMAC converges over the whole frequency domain. If 1 ≤ β < 2, the convergence of CMAC becomes less stable as β grows. To overcome this problem, a modified algorithm is proposed, and simulation results show that the stability of CMAC can be improved significantly.
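The abstract does not give the paper's training equations, but the role of the learning rate β can be illustrated with a generic one-dimensional CMAC sketch. All names (`cmac_indices`, `train_cmac`), the cell counts, and the target function below are illustrative assumptions, not the paper's setup; the sketch assumes the standard CMAC update in which the output error is distributed evenly over the active cells, scaled by β.

```python
import numpy as np

def cmac_indices(x, n_cells, g, lo=0.0, hi=1.0):
    """Return the indices of the g overlapping cells activated by input x.

    Inputs in [lo, hi) are quantized so that neighboring inputs share
    most of their active cells (CMAC's built-in generalization).
    """
    q = int((x - lo) / (hi - lo) * (n_cells - g))
    return np.arange(q, q + g)

def train_cmac(samples, targets, n_cells=64, g=8, beta=0.5, epochs=200):
    """Train CMAC weights with learning rate beta.

    Each update spreads the output error evenly over the g active
    cells; per the convergence analysis, 0 < beta < 1 converges,
    while beta approaching 2 becomes increasingly unstable.
    """
    w = np.zeros(n_cells)
    for _ in range(epochs):
        for x, y in zip(samples, targets):
            idx = cmac_indices(x, n_cells, g)
            err = y - w[idx].sum()
            w[idx] += beta * err / g
    return w

# Evenly distributed training samples, as assumed in the analysis.
xs = np.linspace(0.0, 1.0, 32, endpoint=False)
ys = np.sin(2 * np.pi * xs)
w = train_cmac(xs, ys, beta=0.5)
preds = np.array([w[cmac_indices(x, 64, 8)].sum() for x in xs])
```

With β = 0.5 the weights settle onto the training targets; rerunning with β close to 2 makes the per-sample updates overshoot and the error oscillate, which is the instability the modified algorithm is designed to suppress.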