Abstract

This paper proposes a nonmonotone scaled conjugate gradient algorithm for solving large-scale unconstrained optimization problems, which combines the idea of the scaled memoryless Broyden-Fletcher-Goldfarb-Shanno (BFGS) preconditioned conjugate gradient method with a nonmonotone line search technique. An attractive property of the proposed method is that the search direction is a sufficient descent direction at every iteration, independently of the line search used. Under appropriate assumptions, the method is proved to be globally convergent for nonconvex smooth functions and R-linearly convergent for strongly convex functions. Preliminary numerical results and comparisons demonstrate the efficiency of the proposed method in practical computation.
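To illustrate the kind of method the abstract describes, the following is a minimal Python sketch, assuming a Grippo-Lampariello-Lucidi-style nonmonotone acceptance rule and a spectrally scaled memoryless BFGS search direction; the function name, parameter choices, and safeguards are illustrative assumptions, not the paper's exact algorithm.

```python
import numpy as np

def nonmonotone_scaled_cg(f, grad, x0, M=10, delta=1e-4, tol=1e-7, max_iter=2000):
    """Sketch: nonmonotone scaled memoryless-BFGS conjugate gradient method.

    All parameter names and safeguard thresholds are hypothetical choices,
    not taken from the paper.
    """
    x = x0.astype(float)
    g = grad(x)
    d = -g                      # start with steepest descent
    fvals = [f(x)]
    for _ in range(max_iter):
        if np.linalg.norm(g) <= tol:
            break
        # Nonmonotone backtracking: accept alpha when the new value is below
        # the max of the last M function values plus a sufficient-decrease term.
        fmax = max(fvals[-M:])
        gd = g @ d
        alpha = 1.0
        while f(x + alpha * d) > fmax + delta * alpha * gd:
            alpha *= 0.5
            if alpha < 1e-12:
                break
        s = alpha * d
        x_new = x + s
        g_new = grad(x_new)
        y = g_new - g
        sy = s @ y
        if sy > 1e-12:
            # Spectral scaling parameter and scaled memoryless BFGS direction,
            # i.e. d = -H g_new with H the scaled one-step BFGS inverse update of I.
            theta = sy / (y @ y)
            gs, gy, yy = g_new @ s, g_new @ y, y @ y
            d = (-theta * g_new
                 + theta * (gs / sy) * y
                 - ((1.0 + theta * yy / sy) * gs / sy - theta * gy / sy) * s)
            if d @ g_new > -1e-12 * (g_new @ g_new):
                d = -g_new      # safeguard: fall back to steepest descent
        else:
            d = -g_new          # curvature condition failed: restart
        x, g = x_new, g_new
        fvals.append(f(x))
    return x, fvals
```

On a strongly convex quadratic such as f(x) = ½ xᵀAx with A = diag(1, 10, 100), the sketch drives the gradient norm below the tolerance in a moderate number of iterations; the nonmonotone rule lets occasional increases in f pass, which is what makes the technique attractive on ill-conditioned problems.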