Abstract

In this paper, we address the problem of regression estimation with a p-dimensional predictor when p is large. We propose a general model in which the regression function is a composite function; the model is a nonlinear extension of the usual sufficient dimension reduction setting. Our strategy for estimating the regression function is based on the estimation of a new parameter, called the reduced dimension. We adopt a minimax point of view and provide both lower and upper bounds on the optimal rates of convergence for estimating the regression function in this model. We prove that our estimator adapts, in the minimax sense, to the unknown value d of the reduced dimension and therefore achieves fast rates of convergence when d << p.
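The composite structure described above can be illustrated with a small simulation: the response depends on the p-dimensional predictor only through a d-dimensional projection, with d much smaller than p. The following sketch is purely illustrative (the index matrix A, the link function g, and all dimensions are hypothetical choices, not the paper's estimator or its assumptions):

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical dimensions: ambient dimension p, reduced dimension d << p.
p, d, n = 100, 2, 500

# Unknown d x p index matrix: the regression function factors through A x.
A = rng.standard_normal((d, p)) / np.sqrt(p)

def g(z):
    # A smooth link function on the reduced d-dimensional space
    # (an arbitrary choice for illustration).
    return np.sin(z[..., 0]) + z[..., 1] ** 2

# Simulate the composite model y = g(A x) + noise.
X = rng.standard_normal((n, p))
y = g(X @ A.T) + 0.1 * rng.standard_normal(n)

print(X.shape, y.shape)
```

Estimating the regression function x -> g(A x) directly in p dimensions would suffer the usual curse of dimensionality; the point of the reduced dimension d is that, when it can be adapted to, nonparametric rates depend on d rather than p.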

  • Publication date: 2014-07
