Abstract

In this paper we consider discrete inverse problems for which noise becomes negligible compared to the data as the model norm increases. We introduce two novel definitions of regularization for characterizing inversion methods that provide approximations of ill-conditioned inverse operators consistent with such noisy data. These definitions require, respectively, that the reconstruction error computed from normalized data (p-asymptotic regularization) and the relative reconstruction error (p-relative regularization) go to zero as the model norm tends to infinity, where 0 ≤ p < 1 is a parameter controlling the growth rate of the noise level. We investigate the relationship between these two definitions and prove that they are both equivalent for positively homogeneous iterative algorithms with suitable stopping rules. A crucial consequence of this result is that such iterative algorithms achieve regularization independently of the noise model. We then give sufficient conditions for such methods to be p-asymptotic and p-relative regularizations in a discrete setting, and we prove that the classical expectation maximization algorithm for Poisson data and the Landweber algorithm, if suitably stopped, are regularization methods in this sense. We perform numerical simulations for image deconvolution and computerized tomography to show that, in the presence of model-dependent noise, the reconstructions provided by the above-mentioned methods improve with increasing model norm, as required by the p-asymptotic and p-relative regularization properties. More extensive studies of p-asymptotic and p-relative regularization for Tikhonov-type methods will be the subject of future work.
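The abstract refers to the classical Landweber algorithm with early stopping as an instance of an iterative regularization method. The following is a minimal NumPy sketch of that well-known iteration on a toy ill-conditioned problem, not the paper's own implementation; the stopping constant (1.1), the test operator (a discrete integration matrix), and all parameter values are illustrative assumptions.

```python
import numpy as np

def landweber(A, y, tau=None, max_iter=500, delta=0.0):
    """Landweber iteration x_{k+1} = x_k + tau * A^T (y - A x_k).

    When a noise level `delta` is supplied, iteration stops early once
    the residual reaches 1.1 * delta (a discrepancy-type rule with an
    illustrative, not prescribed, constant)."""
    if tau is None:
        # Convergence requires 0 < tau < 2 / ||A||_2^2.
        tau = 1.0 / np.linalg.norm(A, 2) ** 2
    x = np.zeros(A.shape[1])
    for _ in range(max_iter):
        r = y - A @ x
        if delta > 0 and np.linalg.norm(r) <= 1.1 * delta:
            break  # early stopping acts as the regularizer
        x = x + tau * A.T @ r
    return x

# Toy ill-conditioned problem: a discrete integration operator.
rng = np.random.default_rng(0)
n = 50
A = np.tril(np.ones((n, n))) / n           # cumulative averaging matrix
x_true = np.sin(np.linspace(0.0, np.pi, n))
noise = 1e-3 * rng.standard_normal(n)
y = A @ x_true + noise
x_rec = landweber(A, y, delta=np.linalg.norm(noise))
```

For step sizes in the admissible range, the residual norm is non-increasing, so stopping at the noise level is well defined; iterating far beyond that point would let the noise-amplified components dominate (semi-convergence).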

  • Publication date: 2017
  • Affiliation: INRIA