Abstract

A nonlinear conjugate gradient algorithm is proposed which modifies the Dai and Yuan [A nonlinear conjugate gradient method with a strong global convergence property, SIAM J. Optim. 10 (1999), pp. 177-182] conjugate gradient algorithm so that it satisfies a parameterized sufficient descent condition with a parameter δ_k. The parameter δ_k is computed by means of the conjugacy condition, yielding an algorithm that is a positive multiplicative modification of the Hestenes and Stiefel [Methods of conjugate gradients for solving linear systems, J. Res. Nat. Bur. Standards Sec. B 48 (1952), pp. 409-436] algorithm. The algorithm can be viewed as an adaptive version of the Dai and Liao [New conjugacy conditions and related nonlinear conjugate gradient methods, Appl. Math. Optim. 43 (2001), pp. 87-101] conjugate gradient algorithm. Close to our computational scheme is the conjugate gradient algorithm recently proposed by Hager and Zhang [A new conjugate gradient method with guaranteed descent and an efficient line search, SIAM J. Optim. 16 (2005), pp. 170-192]. Computational results on a set of 750 unconstrained optimization test problems show that this new conjugate gradient algorithm substantially outperforms the known conjugate gradient algorithms.
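To fix ideas, the following is a minimal sketch of a generic nonlinear conjugate gradient loop using the Dai-Yuan update β_k = ||g_{k+1}||² / (d_kᵀ y_k) with y_k = g_{k+1} − g_k, paired with a simple backtracking Armijo line search. This is only an illustration of the baseline scheme the abstract starts from; the paper's actual algorithm (the adaptive δ_k modification) is not reproduced here, and the function names and tolerances are the author's own choices for the sketch.

```python
import numpy as np

def dai_yuan_cg(f, grad, x0, tol=1e-6, max_iter=500):
    """Sketch of nonlinear CG with the Dai-Yuan beta:
        beta_k = ||g_{k+1}||^2 / (d_k^T y_k),  y_k = g_{k+1} - g_k.
    Uses a backtracking Armijo line search and a steepest-descent
    restart as a safeguard. Illustrative only, not the modified
    algorithm of the paper."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g                              # initial direction: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # Backtracking Armijo line search along d
        t, fx, gd = 1.0, f(x), g @ d
        while f(x + t * d) > fx + 1e-4 * t * gd and t > 1e-12:
            t *= 0.5
        x_new = x + t * d
        g_new = grad(x_new)
        y = g_new - g
        denom = d @ y
        beta = (g_new @ g_new) / denom if abs(denom) > 1e-12 else 0.0
        d = -g_new + beta * d
        if g_new @ d >= 0:              # restart if not a descent direction
            d = -g_new
        x, g = x_new, g_new
    return x
```

On a simple convex quadratic, for example f(x) = ½ xᵀ diag(1, 10) x, the loop drives the gradient norm below the tolerance in a modest number of iterations.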

  • Publication date: 2009-2