Abstract:
In this paper, we introduce the so-called hierarchical interaction models, where we assume that the computation of the value of a function m : ℝd → ℝ is done in several layers, where in each layer a function of at most d* inputs computed by the previous layer is evaluated. We investigate two different regression estimates based on polynomial splines and on neural networks, and show that if the regression function satisfies a hierarchical interaction model and all functions occurring in the model are smooth, the rate of convergence of these estimates depends on d* (and not on d). Hence, in this case, the estimates can achieve a good rate of convergence even for large d, and are in this sense able to circumvent the so-called curse of dimensionality.
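To make the layered structure concrete, the following is a minimal sketch (all function names and forms are our own illustrative choices, not taken from the paper) of a function m : ℝ⁶ → ℝ satisfying a hierarchical interaction model with d* = 2: every function evaluated in any layer depends on at most two outputs of the previous layer, even though the overall input dimension is d = 6.

```python
import math

def m(x):
    """Illustrative hierarchical interaction model with d = 6 and d* = 2.

    Each layer evaluates functions of at most d* = 2 quantities computed
    by the previous layer, so the effective dimension driving the rate of
    convergence is d* = 2 rather than d = 6.
    """
    # Layer 1: bivariate functions of the raw inputs.
    h1 = math.sin(x[0] * x[1])
    h2 = math.exp(-(x[2] - x[3]) ** 2)
    h3 = x[4] + x[5] ** 2
    # Layer 2: bivariate functions of layer-1 outputs.
    g1 = h1 * h2
    g2 = h2 + h3
    # Output layer: one bivariate function of layer-2 outputs.
    return g1 / (1.0 + g2 ** 2)

print(m([0.5, 1.0, 0.2, 0.1, 0.3, 0.4]))
```

Because each building block is a smooth function of only d* arguments, spline or neural-network estimates can, per the paper's result, attain convergence rates governed by d* = 2 instead of the ambient dimension d = 6.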
Published in: IEEE Transactions on Information Theory ( Volume: 63, Issue: 3, March 2017)