Abstract

Another hybrid conjugate gradient algorithm is proposed in this paper. The parameter $\beta_k$ is computed as a convex combination of the Hestenes-Stiefel formula $\beta_k^{HS}$ and the Dai-Yuan formula $\beta_k^{DY}$, i.e. $\beta_k^C = (1-\theta_k)\beta_k^{HS} + \theta_k\beta_k^{DY}$. The parameter $\theta_k$ in the convex combination is chosen so that the search direction of the conjugate gradient algorithm is the Newton direction and the pair $(s_k, y_k)$ satisfies the modified secant condition given by Zhang et al. [32] and Zhang and Xu [33], where $s_k = x_{k+1} - x_k$ and $y_k = g_{k+1} - g_k$. The algorithm uses the standard Wolfe line search conditions. Numerical comparisons with conjugate gradient algorithms show that this hybrid computational scheme outperforms a variant of the hybrid conjugate gradient algorithm given by Andrei [6], in which the pair $(s_k, y_k)$ satisfies the classical secant condition $\nabla^2 f(x_{k+1})\, s_k = y_k$, as well as the Hestenes-Stiefel and Dai-Yuan conjugate gradient algorithms and the hybrid conjugate gradient algorithms of Dai and Yuan. A set of 750 unconstrained optimization problems is used, some of them from the CUTE library.
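The convex-combination update described above can be sketched as follows. This is a minimal illustration, not the paper's algorithm: $\theta_k$ is held at a fixed value rather than computed from the Newton-direction/modified-secant condition, and a simple Armijo backtracking search (with a steepest-descent restart safeguard) stands in for the standard Wolfe line search. All function and variable names here are hypothetical.

```python
import numpy as np

def hybrid_cg(f, grad, x0, theta=0.5, max_iter=200, tol=1e-8):
    """Sketch of a hybrid CG method with
    beta_C = (1 - theta) * beta_HS + theta * beta_DY.
    Assumption: theta is a fixed constant here; in the paper theta_k is
    recomputed every step from the modified secant condition."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g  # initial direction: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # Armijo backtracking (placeholder for the Wolfe conditions)
        alpha = 1.0
        while f(x + alpha * d) > f(x) + 1e-4 * alpha * g.dot(d):
            alpha *= 0.5
            if alpha < 1e-12:
                break
        x_new = x + alpha * d
        g_new = grad(x_new)
        y = g_new - g                       # y_k = g_{k+1} - g_k
        dy = d.dot(y)
        if abs(dy) < 1e-16:                 # degenerate denominator: restart
            d = -g_new
        else:
            beta_hs = g_new.dot(y) / dy     # Hestenes-Stiefel
            beta_dy = g_new.dot(g_new) / dy # Dai-Yuan
            beta_c = (1 - theta) * beta_hs + theta * beta_dy
            d = -g_new + beta_c * d
            if g_new.dot(d) >= 0:           # safeguard: keep a descent direction
                d = -g_new
        x, g = x_new, g_new
    return x

# usage: minimize the convex quadratic f(x) = 0.5 x^T A x - b^T x
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 1.0])
f = lambda x: 0.5 * x.dot(A).dot(x) - b.dot(x)
grad = lambda x: A.dot(x) - b
x_star = hybrid_cg(f, grad, np.zeros(2))   # minimizer solves A x = b
```

On a strongly convex quadratic the denominator $d_k^T y_k$ is automatically positive, so both formulas are well defined; the restart safeguard only matters for general nonlinear objectives with an inexact line search.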

  • Publication date: 2008-12