An improved global risk bound in concave regression

Author: Sabyasachi Chatterjee*
Source: Electronic Journal of Statistics, 2016, 10(1): 1608-1629.
DOI: 10.1214/16-EJS1151

Abstract

A new risk bound is presented for the problem of convex/concave function estimation using the least squares estimator. The best previously known risk bound, which appeared in Guntuboyina and Sen [8], scales like log(en) · n^{-4/5} under the mean squared error loss, up to a constant factor. The authors of [8] conjectured that the logarithmic term may be an artifact of their proof. We show that the logarithmic term is indeed unnecessary and prove a risk bound which scales like n^{-4/5} up to constant factors. Our proof technique has one more peeling step than a usual chaining-type argument. Our risk bound holds in expectation as well as with high probability, and it extends to the case of model misspecification, where the true function may not be concave.
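The least squares estimator discussed in the abstract is the minimizer of the squared error over all concave sequences at the design points. As a minimal illustration (not the authors' implementation), concavity on ordered design points can be imposed by requiring the slopes between consecutive points to be non-increasing, which turns the estimator into a small quadratic program; the sketch below solves it numerically with SciPy's general-purpose SLSQP solver.

```python
import numpy as np
from scipy.optimize import minimize


def concave_lse(x, y):
    """Concave least squares fit at ordered design points x.

    Minimizes sum((theta - y)^2) subject to concavity of theta,
    encoded as non-increasing slopes between consecutive points.
    A minimal sketch for illustration, not the paper's algorithm.
    """
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)

    def slope_decrements(theta):
        # Concavity <=> successive slopes are non-increasing,
        # i.e. -diff(slopes) >= 0 for the inequality constraint.
        slopes = np.diff(theta) / np.diff(x)
        return -np.diff(slopes)

    res = minimize(
        lambda t: np.sum((t - y) ** 2),
        x0=y.copy(),
        constraints={"type": "ineq", "fun": slope_decrements},
        method="SLSQP",
    )
    return res.x


# Example: noisy samples of the concave function f(t) = -t^2.
rng = np.random.default_rng(0)
x = np.linspace(-1.0, 1.0, 30)
y = -x**2 + 0.1 * rng.standard_normal(x.size)
theta_hat = concave_lse(x, y)
```

For larger samples a dedicated quadratic programming solver (or a convex modeling library) would be the more practical choice; the formulation above is chosen only for self-containedness.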

  • Publication date: 2016