
On the Asymptotic Improvement in the Outcome of Supervised Learning Provided by Additional Nonsupervised Learning



This paper treats an aspect of the learning or estimation phase of statistical pattern recognition (and of adaptive statistical decision making in general). Simple mathematical expressions are derived for the improvement in supervised learning provided by additional nonsupervised learning when the number of learning samples is large, so that asymptotic approximations are appropriate. The paper consists largely of the examination of a specific example, but, as is briefly discussed, the same procedure can be applied to other parametric problems, and generalization to nonparametric problems seems possible. The example treated has the additional interesting aspect that the data do not have structure that would enable the machine to learn in the nonsupervised mode alone, yet the additional nonsupervised learning can provide substantial improvement over the results obtainable by supervised learning alone. A second purpose of the paper is to suggest the analytical study of the possible benefits of combining supervised and nonsupervised learning as a fruitful new area of research.
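The following Python sketch is not the paper's example or estimator; the mixture model and the moment-based estimator are assumptions chosen only to illustrate the qualitative point of the abstract. In a symmetric two-class Gaussian problem, unlabeled samples alone cannot resolve which class lies at +mu (so nonsupervised learning by itself fails), but once a few labeled samples fix that assignment, the unlabeled samples substantially sharpen the estimate of the parameter, reducing the mean-squared error relative to supervised learning alone.

import numpy as np

rng = np.random.default_rng(0)

def simulate(mu=1.0, n_labeled=20, n_unlabeled=500, n_trials=2000):
    """Monte Carlo comparison of a supervised-only estimator of mu with a
    combined estimator that also uses unlabeled samples.

    Illustrative model (an assumption, not the paper's): two equally likely
    classes, with x ~ N(+mu, 1) for class +1 and x ~ N(-mu, 1) for class -1.
    Unlabeled data alone cannot identify the sign of mu (the class
    assignment), but it carries information about |mu| via E[x^2] = mu^2 + 1.
    """
    err_sup, err_comb = [], []
    for _ in range(n_trials):
        # Labeled sample: labels y in {+1, -1}, observations x = y*mu + noise.
        y = rng.choice([-1.0, 1.0], size=n_labeled)
        x_lab = y * mu + rng.standard_normal(n_labeled)

        # Unlabeled sample from the same mixture (labels drawn, then discarded).
        y_u = rng.choice([-1.0, 1.0], size=n_unlabeled)
        x_unl = y_u * mu + rng.standard_normal(n_unlabeled)

        # Supervised-only estimate: average of label-corrected observations.
        mu_sup = np.mean(y * x_lab)

        # Combined estimate: magnitude from the second moment of ALL
        # observations (labeled + unlabeled), sign taken from the labeled data.
        x_all = np.concatenate([x_lab, x_unl])
        mag = np.sqrt(max(np.mean(x_all ** 2) - 1.0, 0.0))
        mu_comb = np.sign(mu_sup) * mag

        err_sup.append((mu_sup - mu) ** 2)
        err_comb.append((mu_comb - mu) ** 2)

    return np.mean(err_sup), np.mean(err_comb)

if __name__ == "__main__":
    mse_sup, mse_comb = simulate()
    print(f"MSE, supervised only       : {mse_sup:.4f}")
    print(f"MSE, supervised + unlabeled: {mse_comb:.4f}")

With these (assumed) settings the combined estimator's mean-squared error is roughly an order of magnitude smaller than the supervised-only error, mirroring the kind of asymptotic improvement the paper quantifies analytically.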

Published in: IEEE Transactions on Computers (Volume: C-19, Issue: 11)