Abstract

The Barzilai-Borwein conjugate gradient methods, first proposed by Dai and Kou (Sci China Math 59(8):1511-1524, 2016), are remarkably efficient for strictly convex quadratic minimization. In this paper, we present an efficient Barzilai-Borwein conjugate gradient method for unconstrained optimization. Motivated by the Barzilai-Borwein method and the linear conjugate gradient method, we derive a new search direction that satisfies the sufficient descent condition, based on a quadratic model over a two-dimensional subspace, and design a new strategy for choosing the initial stepsize. We also propose a generalized Wolfe line search, which is nonmonotone and avoids a numerical drawback of the original Wolfe line search. Under mild conditions, we establish the global convergence and the R-linear convergence of the proposed method; we also analyze its convergence for convex functions. Numerical results on the CUTEr library and the test problem collection of Andrei show that the proposed method is superior to two well-known conjugate gradient methods, proposed by Dai and Kou (SIAM J Optim 23(1):296-320, 2013) and Hager and Zhang (SIAM J Optim 16(1):170-192, 2005), respectively.
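
For background, a minimal sketch of the two classical Barzilai-Borwein stepsizes and the standard (monotone) Wolfe conditions that the abstract builds on, assuming the usual notation $s_{k-1} = x_k - x_{k-1}$, $y_{k-1} = g_k - g_{k-1}$ with $g_k = \nabla f(x_k)$, and line search parameters $0 < \delta < \sigma < 1$; the paper's new search direction, initial-stepsize rule, and generalized nonmonotone Wolfe conditions are developed in the body and are not reproduced here.
\[
\alpha_k^{BB1} = \frac{s_{k-1}^{\top} s_{k-1}}{s_{k-1}^{\top} y_{k-1}},
\qquad
\alpha_k^{BB2} = \frac{s_{k-1}^{\top} y_{k-1}}{y_{k-1}^{\top} y_{k-1}},
\]
\[
f(x_k + \alpha_k d_k) \le f(x_k) + \delta\, \alpha_k\, g_k^{\top} d_k,
\qquad
g(x_k + \alpha_k d_k)^{\top} d_k \ge \sigma\, g_k^{\top} d_k.
\]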