Abstract

We devise a new generalized univariate Newton method for solving nonlinear equations, motivated by Bregman distances and proximal regularization of optimization problems. We prove quadratic convergence of the new method, a special instance of which is the classical Newton method. We illustrate the possible benefits of the new method over the classical Newton method by means of test problems involving the Lambert W function, Kullback-Leibler distance, and a polynomial. These test problems provide insight as to which instance of the generalized method could be chosen for a given nonlinear equation. Finally, we derive a closed-form expression for the asymptotic error constant of the generalized method and make further comparisons involving this constant.
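For context, the classical Newton method that the abstract identifies as a special instance of the generalized scheme can be sketched as follows. This is only an illustrative sketch of the classical iteration, not the paper's generalized method; the test equation x·exp(x) = 1 (whose root is the Lambert W value W(1)) and the function names are assumptions chosen for illustration.

```python
import math

def newton(f, fprime, x0, tol=1e-12, max_iter=50):
    """Classical Newton iteration x_{k+1} = x_k - f(x_k)/f'(x_k)."""
    x = x0
    for _ in range(max_iter):
        fx = f(x)
        if abs(fx) < tol:          # stop once the residual is small enough
            return x
        x = x - fx / fprime(x)     # Newton update step
    return x

# Illustrative example (not from the paper): solve x * exp(x) = 1,
# i.e. compute the Lambert W function value W(1), the omega constant.
f = lambda x: x * math.exp(x) - 1.0
fprime = lambda x: (x + 1.0) * math.exp(x)
print(newton(f, fprime, x0=1.0))   # ~0.5671432904, converging quadratically
```

The quadratic convergence visible in such an iteration is the property the paper establishes for the whole generalized family, of which this classical update is one member.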

  • Publication date: 2012-12