Near-Optimal Approximation Rates for Distribution Free Learning with Exponentially Mixing Observations | IEEE Conference Publication | IEEE Xplore

Near-Optimal Approximation Rates for Distribution Free Learning with Exponentially Mixing Observations


Abstract:

This paper derives the rate of convergence for the distribution free learning problem when the observation process is an exponentially strongly mixing (α-mixing with an exponential rate) Markov chain. If {z_k}_{k=1}^m = {(x_k, y_k)}_{k=1}^m ⊂ X × Y ≡ Z is an exponentially strongly mixing Markov chain with stationary measure ρ, it is shown that the empirical estimate f_z that minimizes the discrete quadratic risk satisfies the bound E_{z∈Z^m}(‖f_ρ − f_z‖_{L²(ρ_X)}) ≤ C (ln a / a)^{r/(2r+1)}, where E_{z∈Z^m}(·) is the expectation over the first m steps of the chain, f_ρ is the regressor function in L²(ρ_X) associated with ρ, r is related to the abstract smoothness of the regressor, ρ_X is the marginal measure associated with ρ, and a is the rate of concentration of the Markov chain.
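The estimator f_z described above is ordinary empirical risk minimization with the quadratic loss, applied to dependent (mixing) data. The following is a minimal sketch, not the paper's construction: the hypothesis class (degree-3 polynomials), the AR(1) chain used to produce exponentially mixing observations, and the target function are all illustrative assumptions.

```python
import numpy as np

# Hypothetical illustration of the empirical quadratic risk minimizer:
# f_z = argmin_f (1/m) * sum_k (f(x_k) - y_k)^2 over a hypothesis class.
# The class (degree-3 polynomials) and the data-generating chain are
# assumptions for this sketch, not choices made in the paper.

rng = np.random.default_rng(0)
m = 2000

# AR(1) chain x_{k+1} = 0.5 * x_k + noise: a standard example of an
# exponentially strongly mixing (alpha-mixing) Markov chain.
x = np.empty(m)
x[0] = rng.normal()
for k in range(1, m):
    x[k] = 0.5 * x[k - 1] + rng.normal(scale=0.5)

# Observations y_k = f_rho(x_k) + noise for a smooth regressor f_rho.
y = np.sin(x) + rng.normal(scale=0.1, size=m)

# Minimize the discrete quadratic risk over degree-3 polynomials:
# least squares on the monomial features.
A = np.vander(x, 4)                 # columns: x^3, x^2, x, 1
coef, *_ = np.linalg.lstsq(A, y, rcond=None)

def f_z(t):
    """Empirical risk minimizer evaluated at points t."""
    return np.vander(np.atleast_1d(t), 4) @ coef

# Discrete quadratic risk attained by f_z on the chain's first m steps.
risk = float(np.mean((A @ coef - y) ** 2))
print(risk)
```

With a sample path this long, the attained empirical risk is close to the noise variance (0.01 here), since a cubic approximates sin well on the range the chain visits.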
Date of Conference: 30 June 2010 - 02 July 2010
Date Added to IEEE Xplore: 29 July 2010
Conference Location: Baltimore, MD, USA

