Towards more practical average bounds on supervised learning

Authors: Hanzhong Gu; H. Takahashi (Dept. of Communications and Systems Engineering, University of Electro-Communications, Chofu, Japan)

In this paper, we describe a method that enables the average generalization performance of learning to be studied directly via hypothesis-testing inequalities. The resulting theory provides a unified view of average-case learning curves for concept learning and regression in realistic learning problems, not necessarily within the Bayesian framework. Its advantages are that it alleviates the practical pessimism frequently attributed to the results of Vapnik-Chervonenkis (VC) theory and the like, and that it offers general insights into generalization. Moreover, the bounds on the learning curves are directly related to the number of adjustable system weights. Although the theory rests on an approximation assumption and does not apply to the worst-case learning setting, the precondition of the assumption is mild, and the approximation itself is only a sufficient condition for the validity of the theory. We illustrate the results with numerical simulations and apply the theory to examine the generalization ability of combinations of neural networks.
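The abstract's claim that generalization bounds scale with the number of adjustable system weights can be illustrated with a generic average-case learning-curve experiment. The sketch below is not the authors' method; it is a standard simulation, under assumed settings (linear regression with `d` weights and Gaussian noise), in which the average excess test error shrinks roughly like sigma^2 * d / n as the training-set size n grows:

```python
import numpy as np

rng = np.random.default_rng(0)
d = 10        # number of adjustable weights (assumed for illustration)
sigma = 0.5   # noise standard deviation (assumed for illustration)
w_true = rng.normal(size=d)

def avg_excess_error(n, trials=200, n_test=1000):
    """Average test MSE above the noise floor, over many random training sets."""
    errs = []
    for _ in range(trials):
        # Draw a training set of size n and fit by least squares.
        X = rng.normal(size=(n, d))
        y = X @ w_true + sigma * rng.normal(size=n)
        w_hat, *_ = np.linalg.lstsq(X, y, rcond=None)
        # Measure mean squared error on fresh test data.
        Xt = rng.normal(size=(n_test, d))
        yt = Xt @ w_true + sigma * rng.normal(size=n_test)
        errs.append(np.mean((Xt @ w_hat - yt) ** 2))
    # Subtract the irreducible noise variance to get the excess error.
    return np.mean(errs) - sigma**2

for n in (20, 40, 80, 160):
    print(f"n={n:4d}  excess error ~ {avg_excess_error(n):.4f}  "
          f"(d*sigma^2/n = {d * sigma**2 / n:.4f})")
```

Doubling n roughly halves the average excess error, mirroring the kind of dependence on the weight count that the paper's bounds formalize in a much more general setting.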

Published in: IEEE Transactions on Neural Networks (Volume 7, Issue 4)