Abstract

A new three-term conjugate gradient algorithm satisfying both the descent condition and the conjugacy condition is presented. The algorithm is obtained by minimizing a one-parameter quadratic model of the objective function in which the symmetric approximation of the Hessian matrix satisfies the general quasi-Newton equation. The search direction is obtained by symmetrizing the iteration matrix corresponding to the solution of the quadratic model minimization. Through the general quasi-Newton equation, the search direction includes a parameter, which is determined by minimizing the condition number of the iteration matrix. It is proved that this direction satisfies both the conjugacy and the descent condition. The new approximation of the minimum is obtained by the general Wolfe line search combined with a now-standard acceleration technique. Under standard assumptions, global convergence of the algorithm is proved for uniformly convex functions. Numerical experiments on 800 large-scale unconstrained optimization test problems show that minimizing the condition number of the iteration matrix yields a value of the parameter in the search direction that defines a competitive three-term conjugate gradient algorithm. Numerical comparisons of this variant of the algorithm against the known conjugate gradient algorithms ASCALCG, CONMIN, TTCG and THREECG, as well as the limited-memory quasi-Newton algorithm LBFGS (m = 5) and the truncated Newton algorithm TN, show that our algorithm is more efficient and more robust.
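The abstract does not give the paper's exact formulas, so the following is only a minimal sketch of the generic three-term conjugate gradient template it refers to, d_{k+1} = -g_{k+1} + a_k·s_k + b_k·y_k with s_k = x_{k+1} - x_k and y_k = g_{k+1} - g_k. The coefficient choice below is an illustrative Hestenes–Stiefel-type one (which guarantees the descent condition d'g = -||g||²), and a backtracking Armijo line search stands in for the paper's Wolfe search with acceleration; none of these choices are the paper's own parameter selection via condition-number minimization.

```python
import numpy as np

def three_term_cg(f, grad, x0, iters=200, tol=1e-8):
    """Generic three-term CG sketch (illustrative, not the paper's algorithm).

    Direction: d_{k+1} = -g_{k+1} + a_k * s_k + b_k * y_k, with
    s_k = x_{k+1} - x_k and y_k = g_{k+1} - g_k.
    """
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g  # steepest descent on the first iteration
    for _ in range(iters):
        if np.linalg.norm(g) < tol:
            break
        # Backtracking Armijo line search (the paper uses a general Wolfe
        # line search plus an acceleration technique instead).
        t, c = 1.0, 1e-4
        while f(x + t * d) > f(x) + c * t * g.dot(d):
            t *= 0.5
        s = t * d
        x_new = x + s
        g_new = grad(x_new)
        y = g_new - g
        ys = y.dot(s)
        if abs(ys) > 1e-12:
            # Illustrative HS-type coefficients; this choice makes
            # d.dot(g_new) = -||g_new||^2, i.e. a sufficient-descent direction.
            a = g_new.dot(y) / ys
            b = -g_new.dot(s) / ys
            d = -g_new + a * s + b * y
        else:
            d = -g_new  # restart with steepest descent
        x, g = x_new, g_new
    return x
```

As a quick check, applying the sketch to a strictly convex quadratic f(x) = ½xᵀAx - bᵀx recovers the solution of Ax = b.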

  • Publication date: 2015-2