Abstract

Mathematical programming is a rich and well-developed area of operations research, yet many challenging problems remain, among them large-scale optimization. This article presents a modified Polak-Ribière-Polyak conjugate gradient algorithm that incorporates a non-monotone line search technique. The method exploits not only gradient information but also function value information, and its search direction satisfies the sufficient descent condition independently of any line search. Under suitable conditions, global convergence is established for non-convex functions. Numerical results show that the proposed method is competitive with other conjugate gradient methods on large-scale optimization problems.
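For readers unfamiliar with the baseline the paper builds on, the following is a minimal sketch of the classic Polak-Ribière-Polyak conjugate gradient iteration with a standard Armijo backtracking line search. It is not the paper's modified non-monotone method; the restart safeguard and all parameter values are illustrative assumptions.

```python
import numpy as np

def prp_cg(f, grad, x0, tol=1e-6, max_iter=1000):
    """Classic PRP conjugate gradient with Armijo backtracking.

    A textbook baseline, not the modified non-monotone variant
    described in the article.
    """
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g  # initial direction: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # Armijo backtracking line search (c, rho are conventional choices)
        alpha, c, rho = 1.0, 1e-4, 0.5
        fx = f(x)
        while f(x + alpha * d) > fx + c * alpha * g.dot(d):
            alpha *= rho
            if alpha < 1e-12:
                break
        x_new = x + alpha * d
        g_new = grad(x_new)
        # PRP parameter: beta = g_{k+1}^T (g_{k+1} - g_k) / ||g_k||^2
        beta = max(g_new.dot(g_new - g) / g.dot(g), 0.0)  # PRP+ safeguard
        d = -g_new + beta * d
        if g_new.dot(d) >= 0:  # restart if direction is not descent
            d = -g_new
        x, g = x_new, g_new
    return x

# Example: minimize the convex quadratic f(x) = 0.5 x^T A x - b^T x,
# whose minimizer solves A x = b.
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 1.0])
f = lambda x: 0.5 * x @ A @ x - b @ x
grad = lambda x: A @ x - b
x_star = prp_cg(f, grad, np.zeros(2))
```

On this quadratic the iterate converges to the solution of `A x = b`; the non-negativity truncation of `beta` (often written PRP+) is one common way to keep the direction well-behaved, which the paper's modification addresses differently via function values and a non-monotone line search.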