Agnostically learning halfspaces

Authors:

A. T. Kalai (TTI-Chicago, Chicago, IL, USA); A. R. Klivans; Y. Mansour; R. A. Servedio

Abstract:

We give the first algorithm that (under distributional assumptions) efficiently learns halfspaces in the notoriously difficult agnostic framework of Kearns, Schapire, and Sellie, where a learner is given access to labeled examples drawn from a distribution, without restriction on the labels (e.g., adversarial noise). The algorithm constructs a hypothesis whose error rate on future examples is within an additive ε of the optimal halfspace, in time poly(n) for any constant ε > 0, under the uniform distribution over {-1, 1}^n or the unit sphere in R^n, as well as under any log-concave distribution over R^n. It also agnostically learns Boolean disjunctions in time 2^{Õ(√n)} with respect to any distribution. The new algorithm, essentially L_1 polynomial regression, is a noise-tolerant, arbitrary-distribution generalization of the "low degree" Fourier algorithm of Linial, Mansour, and Nisan. We also give a new algorithm for PAC learning halfspaces under the uniform distribution on the unit sphere with the current best bounds on the tolerable rate of "malicious noise".
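
As an informal illustration of the L_1 polynomial regression approach the abstract describes (a minimal sketch only, not the authors' implementation; the degree parameter, the linear-programming formulation, and all function names below are assumptions for exposition), one can expand each example into its low-degree monomials, fit a polynomial that minimizes the sum of absolute errors, and classify by thresholding the fitted polynomial:

```python
import itertools
import numpy as np
from scipy.optimize import linprog

def monomial_features(X, degree):
    """All multilinear monomials of total degree <= `degree`.

    X: (n_samples, n_dims) array, e.g. entries in {-1, +1}.
    Returns an (n_samples, n_monomials) feature matrix.
    """
    n, d = X.shape
    cols = [np.ones(n)]                              # constant (degree-0) monomial
    for k in range(1, degree + 1):
        for idx in itertools.combinations(range(d), k):
            cols.append(np.prod(X[:, list(idx)], axis=1))
    return np.column_stack(cols)

def l1_poly_regression(X, y, degree):
    """Fit coefficients w minimizing sum_i |p_w(x_i) - y_i|, posed as a
    linear program:  min sum(t)  s.t.  -t <= Phi w - y <= t,  t >= 0.
    """
    Phi = monomial_features(X, degree)
    n, m = Phi.shape
    c = np.concatenate([np.zeros(m), np.ones(n)])     # objective: sum of slacks t
    A_ub = np.block([[Phi, -np.eye(n)],               #  Phi w - t <=  y
                     [-Phi, -np.eye(n)]])             # -Phi w - t <= -y
    b_ub = np.concatenate([y, -y])
    bounds = [(None, None)] * m + [(0, None)] * n     # w free, t nonnegative
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds, method="highs")
    return res.x[:m]

def predict(X, w, degree, threshold=0.0):
    """Label by the sign of the fitted polynomial, shifted by a chosen threshold."""
    return np.sign(monomial_features(X, degree) @ w - threshold)
```

In the paper's analysis the monomial degree depends only on ε under the stated distributional assumptions (which is what yields poly(n) time for constant ε), and the threshold is chosen to minimize empirical error; both of those choices are omitted from this sketch.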

Published in:

46th Annual IEEE Symposium on Foundations of Computer Science (FOCS 2005)

Date of Conference:

23-25 Oct. 2005