Abstract

Although the study of the global convergence of the Polak-Ribière-Polyak (PRP), Hestenes-Stiefel (HS), and Liu-Storey (LS) conjugate gradient methods has made great progress, the convergence of these algorithms for general nonlinear functions remains unsettled, let alone under weak conditions on the objective function and weak line search rules. Moreover, it is interesting to ask whether there exists a general method that converges under the standard Armijo line search for general nonconvex functions, since very few results of this kind have been obtained. In this paper, we therefore present a new general form of conjugate gradient methods with attractive theoretical properties. For any formula β_k ≥ 0 and under weak conditions, the proposed method satisfies the sufficient descent condition independently of the line search used and of the convexity of the objective function, and its global convergence can be established under the standard Wolfe line search or even under the standard Armijo line search. Based on this new method, convergence results for the PRP, HS, LS, Dai-Yuan-type (DY), and Conjugate-Descent-type (CD) methods are established. Preliminary numerical results show the efficiency of the proposed methods.
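To make the setting concrete, the following is a minimal sketch of a classical nonlinear conjugate gradient iteration using the PRP formula for β_k, clipped at zero so that β_k ≥ 0 (the so-called PRP+ variant), together with a standard Armijo backtracking line search. This illustrates the family of methods the abstract discusses; it is not the paper's proposed general scheme, and the function names, parameter values, and restart safeguard below are our own assumptions.

```python
import numpy as np

def armijo(f, x, d, g, rho=0.5, c=1e-4, max_backtracks=50):
    """Standard Armijo backtracking: find t with f(x + t*d) <= f(x) + c*t*g'd."""
    t = 1.0
    fx = f(x)
    gd = g.dot(d)  # directional derivative; negative for a descent direction
    for _ in range(max_backtracks):
        if f(x + t * d) <= fx + c * t * gd:
            return t
        t *= rho
    return t

def prp_cg(f, grad, x0, tol=1e-6, max_iter=500):
    """Nonlinear CG with the PRP formula, clipped at 0 so beta_k >= 0."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g  # initial direction: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        t = armijo(f, x, d, g)
        x_new = x + t * d
        g_new = grad(x_new)
        # PRP formula: beta_k = g_{k+1}'(g_{k+1} - g_k) / ||g_k||^2, clipped at 0
        beta = max(g_new.dot(g_new - g) / g.dot(g), 0.0)
        d = -g_new + beta * d
        # Safeguard: restart with steepest descent if d is not a descent direction
        if g_new.dot(d) >= 0:
            d = -g_new
        x, g = x_new, g_new
    return x

# Example usage on a simple convex quadratic (hypothetical test problem)
f = lambda x: x[0] ** 2 + 10 * x[1] ** 2
grad = lambda x: np.array([2 * x[0], 20 * x[1]])
x_star = prp_cg(f, grad, [1.0, 1.0])
```

Note that plain PRP does not by itself guarantee the sufficient descent condition for general nonconvex functions; the clipping and the restart safeguard above are common practical remedies, and constructing a general form that satisfies sufficient descent for any β_k ≥ 0, independently of the line search, is precisely the contribution the abstract describes.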

Full text