In this paper, we address the problem of nonparametric regression estimation in the infinite-dimensional setting. We start by extending Stone's seminal result to the case of metric spaces when the probability measure of the explanatory variables is tight. Then, under slight variations of the hypotheses, we state and prove the theorem for general metric measure spaces. From this result, we derive the mean square consistency of the k-NN and kernel estimators when the regression function is bounded and the Besicovitch condition holds. We also prove that, for the uniform kernel estimate, the Besicovitch condition is necessary to attain L1 consistency for almost every x.
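The two estimators named in the abstract can be sketched concretely. Below is a minimal, illustrative implementation (not the paper's own code) of the k-NN regression estimate and the uniform-kernel (Nadaraya-Watson with uniform kernel) estimate over an arbitrary metric; the function names, the default Euclidean metric, and the convention of returning 0 when no sample falls in the kernel window are assumptions for the sketch.

```python
import numpy as np

def knn_regression(X_train, y_train, x, k, metric=None):
    """k-NN estimate: average of y over the k nearest neighbors of x
    with respect to the given metric (Euclidean by default)."""
    if metric is None:
        metric = lambda a, b: np.linalg.norm(a - b)
    dists = np.array([metric(xi, x) for xi in X_train])
    nearest = np.argsort(dists)[:k]   # indices of the k closest samples
    return float(np.mean(y_train[nearest]))

def uniform_kernel_regression(X_train, y_train, x, h, metric=None):
    """Uniform-kernel estimate: average of y over samples within
    distance h of x; returns 0.0 if the ball of radius h is empty
    (a convention assumed here for the sketch)."""
    if metric is None:
        metric = lambda a, b: np.linalg.norm(a - b)
    dists = np.array([metric(xi, x) for xi in X_train])
    in_ball = dists <= h
    if not in_ball.any():
        return 0.0
    return float(np.mean(y_train[in_ball]))
```

Because both estimators only use pairwise distances, the same code applies in any metric space (e.g. function-valued covariates with the sup or L2 distance), which is the setting the paper studies.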