Abstract

Least squares methods are effective for solving systems of partial differential equations. In the case of nonlinear systems, the equations are usually linearized by a Newton iteration or a successive substitution method and then treated as a linear least squares problem. We show that it is often advantageous instead to form a sum of squared residuals first and then compute a zero of the gradient with a Newton-like method. We present a method, based on Sobolev gradients, for treating the nonlinear least squares problem directly. The method solves trust-region subproblems defined by a Sobolev norm using a preconditioned conjugate gradient iteration whose preconditioner arises naturally from the Sobolev space setting. The trust-region method is shown to be equivalent to a Levenberg-Marquardt method that blends a Newton or Gauss-Newton iteration with a gradient descent iteration but uses a Sobolev gradient in place of the Euclidean gradient. We also provide an introduction to the Sobolev gradient method and discuss its relationship to operator preconditioning with equivalent operators.
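To make the blend of Gauss-Newton and Sobolev gradient descent concrete, the following is a minimal sketch of the kind of relations the abstract describes. The notation (F, J, S, λ) is ours, introduced only for illustration, and may not match the paper's; in particular, the choice S = I - Δ is just one common Sobolev inner-product operator, not necessarily the one used in the paper.

% Sketch only; notation is ours, not taken from the paper.
% Residual system F(u) = 0 with Jacobian J = F'(u); the sum of
% squared residuals and its (Euclidean) gradient are
\[
  \phi(u) = \tfrac{1}{2}\,\|F(u)\|^{2}, \qquad
  \nabla\phi(u) = J^{*}F(u).
\]
% With S a symmetric positive definite operator defining the Sobolev
% inner product (e.g., S = I - \Delta for H^1), the Sobolev gradient
% is S^{-1}\nabla\phi(u), and a Levenberg--Marquardt step solves
\[
  \bigl(J^{*}J + \lambda S\bigr)\,\delta u = -J^{*}F(u), \qquad
  u \leftarrow u + \delta u .
\]
% As \lambda \to 0 this recovers the Gauss--Newton step; for large
% \lambda it gives a short step in the Sobolev gradient descent
% direction -S^{-1}J^{*}F(u), which is the blend described above.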

  • Publication date: 2013-3