Kernel techniques supply a systematic and principled approach to training learning machines, and the good generalization performance they achieve can be readily justified by statistical learning theory. In this paper, we convert the classification problem into a set-cover problem and present a kernel covering algorithm that combines the kernel technique with the covering approach. The algorithm is constructive, and thus bypasses the issues of convergence and convergence speed. By analyzing the statistical properties of the covering classifier, we derive a bound on the actual risk. Exploiting the variety of available kernels, we draw a general design principle for feed-forward neural networks.
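The abstract does not spell out the algorithm itself. As a rough illustration of the general idea only, the sketch below greedily covers training points with spheres measured in an RBF-kernel-induced feature space and classifies by sphere membership. The function names, the half-distance radius rule, and the nearest-sphere fallback are all assumptions for illustration, not the paper's actual method:

```python
import numpy as np

def rbf(X, Y, gamma=1.0):
    # Gaussian (RBF) kernel matrix between row sets X and Y
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def kernel_dist(X, Y, gamma=1.0):
    # Feature-space distance: d^2 = k(x,x) + k(y,y) - 2 k(x,y) = 2 - 2 k(x,y) for RBF
    return np.sqrt(np.maximum(2.0 - 2.0 * rbf(X, Y, gamma), 0.0))

def fit_cover(X, y, gamma=1.0):
    """Greedy covering: add a sphere for each point not yet covered
    by a same-class sphere; the radius is half the feature-space
    distance to the nearest opposite-class point, so spheres of
    different classes never overlap a training point."""
    spheres = []  # list of (center, radius, label)
    D = kernel_dist(X, X, gamma)
    for i in range(len(X)):
        if any(lbl == y[i]
               and kernel_dist(X[i:i + 1], c[None], gamma)[0, 0] <= r
               for c, r, lbl in spheres):
            continue  # already covered by a sphere of its own class
        r = 0.5 * D[i, y != y[i]].min()
        spheres.append((X[i], r, y[i]))
    return spheres

def predict(spheres, X, gamma=1.0):
    # Label of the covering sphere; otherwise label of the nearest sphere surface
    out = []
    for x in X:
        best, best_d = None, np.inf
        for c, r, lbl in spheres:
            d = kernel_dist(x[None], c[None], gamma)[0, 0]
            if d <= r:
                best = lbl
                break
            if d - r < best_d:
                best, best_d = lbl, d - r
        out.append(best)
    return np.array(out)
```

By construction every training point falls inside a sphere of its own class and no sphere of the opposite class, so the procedure terminates after one pass with zero training error on separable data; there is no iterative weight update whose convergence would need analysis, which matches the "constructive" character claimed in the abstract.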