Abstract

We consider nonparametric estimation problems in the presence of dependent data, notably nonparametric regression with random design and nonparametric density estimation. The proposed estimation procedure is based on a dimension reduction. The minimax optimal rate of convergence of the estimator is derived assuming a sufficiently weak dependence characterised by fast-decreasing mixing coefficients. We illustrate these results by considering classical smoothness assumptions. However, the proposed estimator requires an optimal choice of a dimension parameter depending on certain characteristics of the function of interest, which are not known in practice. The main issue addressed in our work is an adaptive choice of this dimension parameter combining model selection and Lepski's method. It is inspired by the recent work of Goldenshluger and Lepski [(2011), 'Bandwidth Selection in Kernel Density Estimation: Oracle Inequalities and Adaptive Minimax Optimality', The Annals of Statistics, 39, 1608-1632]. We show that this data-driven estimator can attain the lower risk bound up to a constant, provided the mixing coefficients decay sufficiently fast.

  • Publication date: 2017
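
The abstract describes a projection (dimension-reduction) estimator whose dimension parameter is selected by a data-driven rule combining model selection with Lepski's method, in the spirit of Goldenshluger and Lepski (2011). The following is a minimal sketch of that idea for density estimation on [0, 1]. It is an illustration only: the trigonometric basis, the penalty constant `kappa`, the i.i.d.-style penalty `kappa * m / n`, and the AR(1)-driven example data are assumptions made here for concreteness; the paper's actual procedure calibrates the penalty using the mixing coefficients of the dependent observations.

```python
import numpy as np


def trig_basis(x, m):
    """Evaluate the first m functions of the orthonormal trigonometric basis on [0, 1]."""
    cols = [np.ones_like(x)]
    for j in range(1, m):
        k = (j + 1) // 2
        if j % 2 == 1:
            cols.append(np.sqrt(2.0) * np.cos(2.0 * np.pi * k * x))
        else:
            cols.append(np.sqrt(2.0) * np.sin(2.0 * np.pi * k * x))
    return np.column_stack(cols)


def gl_projection_density(sample, m_max=50, kappa=2.0):
    """Orthogonal-series density estimator with a Goldenshluger-Lepski-type
    data-driven choice of the dimension parameter.

    Hypothetical illustration: the penalty kappa * m / n mimics the i.i.d.
    variance order; the paper's penalty accounts for the dependence.
    Returns the selected dimension and a callable density estimate.
    """
    sample = np.asarray(sample, dtype=float)
    n = sample.size
    coeffs = trig_basis(sample, m_max).mean(axis=0)   # empirical coefficients a_j
    csum = np.cumsum(coeffs ** 2)                     # csum[m-1] = sum_{j<=m} a_j^2
    pen = kappa * np.arange(1, m_max + 1) / n         # penalty pen(m), m = 1..m_max

    best_m, best_crit = 1, np.inf
    for m in range(1, m_max + 1):
        # Comparison term A(m) = max_{m' > m} ( ||f_hat_{m'} - f_hat_m||^2 - pen(m') )_+ ;
        # for nested models the squared distance is csum[m'-1] - csum[m-1] (Parseval).
        if m < m_max:
            gaps = csum[m:] - csum[m - 1] - pen[m:]
            a_m = max(gaps.max(), 0.0)
        else:
            a_m = 0.0
        crit = a_m + pen[m - 1]                       # select m minimising A(m) + pen(m)
        if crit < best_crit:
            best_m, best_crit = m, crit

    def f_hat(t):
        t = np.atleast_1d(np.asarray(t, dtype=float))
        return trig_basis(t, best_m) @ coeffs[:best_m]

    return best_m, f_hat


if __name__ == "__main__":
    # Weakly dependent sample: an AR(1) chain mapped to [0, 1] by the logistic cdf.
    rng = np.random.default_rng(0)
    z = np.empty(2000)
    z[0] = rng.standard_normal()
    for i in range(1, z.size):
        z[i] = 0.5 * z[i - 1] + np.sqrt(0.75) * rng.standard_normal()
    x = 1.0 / (1.0 + np.exp(-z))

    m_hat, f_hat = gl_projection_density(x)
    print("selected dimension:", m_hat)
```

The selection rule trades off a comparison term, which proxies the squared bias by comparing the estimator to all higher-dimensional ones, against the penalty, which proxies the variance; this is what allows the choice to adapt to the unknown smoothness of the function of interest.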