Abstract

Another hybrid conjugate gradient algorithm is subject to analysis. The parameter $\beta_k$ is computed as a convex combination of $\beta_k^{HS}$ (Hestenes-Stiefel) and $\beta_k^{DY}$ (Dai-Yuan), i.e. $\beta_k^C = (1-\theta_k)\beta_k^{HS} + \theta_k\beta_k^{DY}$. The parameter $\theta_k$ in the convex combination is computed so that the direction corresponding to the conjugate gradient algorithm is the Newton direction and the pair $(s_k, y_k)$ satisfies the quasi-Newton equation $\nabla^2 f(x_{k+1})\,s_k = y_k$, where $s_k = x_{k+1} - x_k$ and $y_k = g_{k+1} - g_k$. The algorithm uses the standard Wolfe line search conditions. Numerical comparisons with conjugate gradient algorithms show that this hybrid computational scheme outperforms the Hestenes-Stiefel and Dai-Yuan conjugate gradient algorithms, as well as the hybrid conjugate gradient algorithms of Dai and Yuan. A set of 750 unconstrained optimization problems is used, some of them from the CUTE library.
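The convex combination of the two classical update parameters can be sketched as follows. This is a minimal illustration, not the authors' implementation: the function name `hybrid_beta` and its arguments are hypothetical, and the standard Hestenes-Stiefel and Dai-Yuan formulas $\beta_k^{HS} = g_{k+1}^T y_k / (d_k^T y_k)$ and $\beta_k^{DY} = g_{k+1}^T g_{k+1} / (d_k^T y_k)$ are assumed.

```python
import numpy as np

def hybrid_beta(g_k, g_k1, d_k, theta_k):
    """Hybrid CG parameter beta_k^C as a convex combination of the
    Hestenes-Stiefel and Dai-Yuan formulas (illustrative sketch only;
    the paper computes theta_k from the quasi-Newton condition)."""
    y_k = g_k1 - g_k                        # y_k = g_{k+1} - g_k
    denom = d_k @ y_k                       # shared denominator d_k^T y_k
    beta_hs = (g_k1 @ y_k) / denom          # Hestenes-Stiefel
    beta_dy = (g_k1 @ g_k1) / denom         # Dai-Yuan
    return (1.0 - theta_k) * beta_hs + theta_k * beta_dy
```

For $\theta_k = 0$ the formula reduces to the Hestenes-Stiefel parameter, and for $\theta_k = 1$ to the Dai-Yuan parameter, matching the two endpoints of the convex combination.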

  • Publication date: 2008-2