Abstract

An accelerated hybrid conjugate gradient algorithm is the subject of this paper. The parameter $\beta_k$ is computed as a convex combination of $\beta_k^{HS}$ (Hestenes and Stiefel, J Res Nat Bur Stand 49:409-436, 1952) and $\beta_k^{DY}$ (Dai and Yuan, SIAM J Optim 10:177-182, 1999), i.e. $\beta_k^{C} = (1-\theta_k)\beta_k^{HS} + \theta_k \beta_k^{DY}$. The parameter $\theta_k$ in the convex combination is computed in such a way that the direction corresponding to the conjugate gradient algorithm is the best direction we know, i.e. the Newton direction, while the pair $(s_k, y_k)$ satisfies the modified secant condition given by Li et al. (J Comput Appl Math 202:523-539, 2007), $B_{k+1} s_k = z_k$, where $z_k = y_k + (\eta_k / \|s_k\|^2)\, s_k$, $\eta_k = 2(f_k - f_{k+1}) + (g_k + g_{k+1})^T s_k$, $s_k = x_{k+1} - x_k$ and $y_k = g_{k+1} - g_k$. It is shown that, both for uniformly convex functions and for general nonlinear functions, the algorithm with strong Wolfe line search is globally convergent. The algorithm uses an acceleration scheme that modifies the steplength $\alpha_k$ in order to improve the reduction of the function values along the iterations. Numerical comparisons with conjugate gradient algorithms show that this hybrid computational scheme outperforms a variant of the hybrid conjugate gradient algorithm given by Andrei (Numer Algorithms 47:143-156, 2008), in which the pair $(s_k, y_k)$ satisfies the classical secant condition $B_{k+1} s_k = y_k$, as well as some other conjugate gradient algorithms, including Hestenes-Stiefel, Dai-Yuan, Polak-Ribière-Polyak, Liu-Storey, hybrid Dai-Yuan, and Gilbert-Nocedal. A set of 75 unconstrained optimization problems with 10 different dimensions is used (Andrei, Adv Model Optim 10:147-161, 2008).
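As a minimal illustration (not the paper's implementation), the Python sketch below assembles the convex combination $\beta_k^{C}$ and the modified secant right-hand side $z_k$ from the quantities defined above. The standard formulas $\beta_k^{HS} = g_{k+1}^T y_k / d_k^T y_k$ and $\beta_k^{DY} = \|g_{k+1}\|^2 / d_k^T y_k$ are assumed, $\theta_k$ is taken as a given input (the paper derives its closed form from the Newton direction together with the modified secant condition), and all function names here are hypothetical.

```python
import numpy as np

def beta_hybrid(g_next, y, d, theta):
    """Convex combination beta_k^C = (1 - theta_k) beta_k^HS + theta_k beta_k^DY.

    g_next : gradient g_{k+1}
    y      : y_k = g_{k+1} - g_k
    d      : current search direction d_k
    theta  : theta_k; taken as given here (derived in the paper).
    """
    dty = d @ y                          # common denominator d_k^T y_k
    beta_hs = (g_next @ y) / dty         # Hestenes-Stiefel
    beta_dy = (g_next @ g_next) / dty    # Dai-Yuan
    theta = min(max(theta, 0.0), 1.0)    # clamp to [0, 1] to keep the combination convex
    return (1.0 - theta) * beta_hs + theta * beta_dy

def modified_secant_rhs(f, f_next, g, g_next, s):
    """z_k = y_k + (eta_k / ||s_k||^2) s_k, with
    eta_k = 2 (f_k - f_{k+1}) + (g_k + g_{k+1})^T s_k (Li et al., 2007)."""
    y = g_next - g
    eta = 2.0 * (f - f_next) + (g + g_next) @ s
    return y + (eta / (s @ s)) * s
```

The clamp on $\theta_k$ keeps $\beta_k^{C}$ a genuine convex combination, so the hybrid scheme interpolates between HS (strong practical behavior) and DY (strong convergence theory), which is the motivation for hybrid conjugate gradient methods of this type.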

  • Publication date: 2010-05