Abstract

A new family of conjugate gradient methods is proposed by minimizing the distance between two given directions. It is a subfamily of the Dai-Liao family, which includes the Hager-Zhang family and the Dai-Kou method. The direction of the proposed method is an approximation to that of the memoryless Broyden-Fletcher-Goldfarb-Shanno (BFGS) method. With suitable intervals of the parameters, the direction of the proposed method possesses the sufficient descent property independently of the line search. Under mild assumptions, we analyze the global convergence of the method for strongly convex functions and for general functions, where the stepsize is obtained by the standard Wolfe rules. Numerical results indicate that the proposed method is promising, outperforming CGOPT and CG_DESCENT on a set of unconstrained optimization test problems.
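To make the setting concrete, the following is a minimal sketch of a *generic* Dai-Liao-type conjugate gradient iteration with a simple Wolfe line search. It is not the paper's proposed method or its parameter choices; the update d_{k+1} = -g_{k+1} + beta_k d_k with beta_k = (g_{k+1}·y_k - t g_{k+1}·s_k) / (d_k·y_k) is the standard Dai-Liao formula, and the restart safeguard and line search are illustrative assumptions.

```python
import numpy as np

def wolfe_line_search(f, grad, x, d, c1=1e-4, c2=0.9, alpha=1.0, max_iter=50):
    """Bisection-style search aiming for the standard Wolfe conditions.
    A pedagogical sketch, not the routine used in the paper's experiments."""
    fx, g0 = f(x), grad(x)
    dg0 = g0 @ d                      # directional derivative at x (should be < 0)
    lo, hi = 0.0, np.inf
    for _ in range(max_iter):
        if f(x + alpha * d) > fx + c1 * alpha * dg0:
            hi = alpha                # sufficient-decrease condition fails: shrink
        elif grad(x + alpha * d) @ d < c2 * dg0:
            lo = alpha                # curvature condition fails: grow
        else:
            return alpha
        alpha = 0.5 * (lo + hi) if np.isfinite(hi) else 2.0 * lo
    return alpha

def dai_liao_cg(f, grad, x0, t=1.0, tol=1e-6, max_iter=500):
    """Generic Dai-Liao CG:
        d_{k+1} = -g_{k+1} + beta_k d_k,
        beta_k  = (g_{k+1}.y_k - t * g_{k+1}.s_k) / (d_k.y_k),
    with s_k = x_{k+1} - x_k and y_k = g_{k+1} - g_k."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        alpha = wolfe_line_search(f, grad, x, d)
        x_new = x + alpha * d
        g_new = grad(x_new)
        s, y = x_new - x, g_new - g
        denom = d @ y
        beta = (g_new @ y - t * (g_new @ s)) / denom if abs(denom) > 1e-16 else 0.0
        d = -g_new + beta * d
        if g_new @ d >= 0:            # safeguard: restart if not a descent direction
            d = -g_new
        x, g = x_new, g_new
    return x

# Usage on a strongly convex quadratic f(x) = 0.5 x^T A x - b^T x
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 1.0])
f = lambda x: 0.5 * x @ A @ x - b @ x
grad = lambda x: A @ x - b
x_star = dai_liao_cg(f, grad, np.zeros(2))   # approximates the solution of A x = b
```

The parameter `t` controls the second (Dai-Liao) term of beta_k; different choices of `t` recover different members of the family discussed in the abstract.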