Kernel covering algorithm and a design principle for feed-forward neural networks

Authors:

Gaowei Wu, Qing Tao, Jue Wang (Institute of Automation, Chinese Academy of Sciences, Beijing, China)

Abstract:

The kernel technique supplies a systematic and principled approach to training learning machines, and the good generalization performance it achieves can be readily justified using statistical learning theory. In this paper, we convert the classification problem into a set-cover problem and present a kernel covering algorithm that combines the kernel technique with the covering approach. The algorithm is constructive and therefore bypasses the problems of convergence and convergence speed. By analyzing the statistical properties of the covering classifier, we derive a bound on the actual risk. By virtue of the variety of available kernels, a general design principle for feed-forward neural networks is drawn.
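
The full paper is not reproduced on this page, but the abstract outlines a constructive procedure: map the data into a kernel-induced feature space, greedily cover the training points of each class with balls that exclude points of other classes, and classify new points by the cover they fall closest to. The sketch below is only a minimal illustration of that reading, not the authors' algorithm; the Gaussian RBF kernel, the greedy selection rule, and all names and parameters are assumptions.

```python
# Hypothetical sketch of a kernel covering classifier (not the paper's code).
# Idea: in the kernel-induced feature space, greedily cover each class with
# balls centred at training points; each radius is chosen so the ball contains
# no point of another class. New points are labelled by the nearest cover.
import numpy as np

def rbf_kernel(X, Y, gamma=1.0):
    """Gaussian RBF kernel matrix K[i, j] = exp(-gamma * ||x_i - y_j||^2)."""
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(axis=2)
    return np.exp(-gamma * d2)

def kernel_distance2(k_xx_diag, k_cc_diag, k_xc):
    """Squared feature-space distance: k(x,x) + k(c,c) - 2 k(x,c)."""
    return k_xx_diag[:, None] + k_cc_diag[None, :] - 2.0 * k_xc

class KernelCoveringClassifier:
    def __init__(self, gamma=1.0):
        self.gamma = gamma
        self.centers_ = None   # cover centres (training points)
        self.radii_ = None     # squared feature-space radii
        self.labels_ = None    # class label of each cover

    def fit(self, X, y):
        K = rbf_kernel(X, X, self.gamma)
        diag = np.diag(K)
        d2 = kernel_distance2(diag, diag, K)   # pairwise feature-space distances
        uncovered = np.ones(len(X), dtype=bool)
        centers, radii, labels = [], [], []
        while uncovered.any():
            # Greedy step (assumed rule): pick the uncovered point whose safe
            # ball (largest radius excluding other classes) covers the most
            # still-uncovered points of its own class.
            best = None
            for i in np.where(uncovered)[0]:
                other = y != y[i]
                r2 = d2[i, other].min() if other.any() else np.inf
                covered = uncovered & (y == y[i]) & (d2[i] < r2)
                if best is None or covered.sum() > best[3].sum():
                    best = (i, r2, y[i], covered)
            i, r2, label, covered = best
            centers.append(X[i])
            radii.append(r2)
            labels.append(label)
            uncovered &= ~covered
        self.centers_ = np.array(centers)
        self.radii_ = np.array(radii)
        self.labels_ = np.array(labels)
        return self

    def predict(self, X):
        Kxc = rbf_kernel(X, self.centers_, self.gamma)
        dx = np.ones(len(X))            # k(x, x) = 1 for the RBF kernel
        dc = np.ones(len(self.centers_))
        d2 = kernel_distance2(dx, dc, Kxc)
        # Label each point by its nearest cover centre in feature space.
        return self.labels_[d2.argmin(axis=1)]
```

Under this reading, each cover behaves like a localized hidden unit, so the number and placement of covers directly suggest the width of a single hidden layer; this is one plausible way the abstract's design principle for feed-forward networks could be instantiated, though the paper's own construction may differ.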

Published in:

Proceedings of the 9th International Conference on Neural Information Processing (ICONIP '02), 2002, Vol. 2

Date of Conference:

18-22 Nov. 2002