Abstract

A general non-asymptotic framework, which evaluates the performance of any procedure at individual functions, is introduced in the context of estimating convex functions at a point. This framework, which is significantly different from the conventional minimax theory, is also applicable to other problems in shape-constrained inference. A benchmark is provided for the mean squared error of any estimator that depends on each individual convex function in the same way that the Fisher Information depends on the unknown parameter in a regular parametric model. A local modulus of continuity is introduced and is shown to capture the difficulty of estimating individual convex functions. A fully data-driven estimator is proposed and is shown to perform uniformly within a constant factor of the ideal benchmark for every convex function. Such an estimator is thus adaptive to every unknown function, rather than to a collection of function classes as is typical in the nonparametric function estimation literature.
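For concreteness, a local modulus of continuity of the kind referred to above is typically defined over a neighborhood of the true function; the display below is a sketch of that standard form, assuming estimation of a convex function f at a point x_0 and an L_2-type neighborhood of radius epsilon (the paper's exact definition and normalization may differ):

\[
\omega(\epsilon, f) \;=\; \sup\bigl\{\, |g(x_0) - f(x_0)| \;:\; g \text{ convex},\ \|g - f\|_2 \le \epsilon \,\bigr\}.
\]

Under this assumed form, a pointwise benchmark for the mean squared error at f can be taken to scale as \(\omega(n^{-1/2}, f)^2\) with sample size n, playing a role analogous to the inverse Fisher Information in a regular parametric model.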

  • Publication date: 2015-04