Abstract

A multivariate spectral gradient method is proposed for solving unconstrained optimization problems. Combined with a quasi-Newton property, the multivariate spectral gradient method allows an individual adaptive stepsize along each coordinate direction, which guarantees that the method is finitely convergent for positive definite quadratics. In particular, it converges in no more than two steps for positive definite quadratics with a diagonal Hessian, and quadratically for objective functions with a positive definite diagonal Hessian. Moreover, based on a nonmonotone line search, global convergence is established for the multivariate spectral gradient algorithm. Finally, numerical results are reported, which show that the method is promising and deserves further study.
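To make the idea of a per-coordinate adaptive stepsize concrete, the following is a minimal sketch of how such an iteration could look, assuming componentwise spectral parameters obtained from the ratio of gradient and iterate differences and a Grippo-Lampariello-Lucidi style nonmonotone backtracking rule. The function name `mvsg`, the safeguard bounds, and the line-search parameters are illustrative assumptions, not the paper's exact algorithm.

```python
import numpy as np

def mvsg(f, grad, x0, max_iter=500, tol=1e-6,
         lam_min=1e-10, lam_max=1e10,
         M=10, gamma=1e-4, sigma=0.5):
    """Sketch of a multivariate spectral gradient iteration.

    Per-coordinate stepsizes are taken from the componentwise ratio
    y_{k-1,i} / s_{k-1,i} (a diagonal quasi-Newton-type condition); the
    nonmonotone line search compares against the maximum of the last
    M function values.  Parameter values are illustrative only.
    """
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    lam = np.ones_like(x)          # per-coordinate spectral parameters
    f_hist = [f(x)]

    for _ in range(max_iter):
        if np.linalg.norm(g, np.inf) <= tol:
            break
        d = -g / lam               # search direction: -diag(lam)^{-1} g

        # Nonmonotone backtracking line search (GLL-type reference value)
        f_ref = max(f_hist[-M:])
        alpha = 1.0
        while (f(x + alpha * d) > f_ref + gamma * alpha * g.dot(d)
               and alpha > 1e-12):
            alpha *= sigma

        x_new = x + alpha * d
        g_new = grad(x_new)
        s, y = x_new - x, g_new - g

        # Componentwise ratios, safeguarded to stay positive and bounded
        with np.errstate(divide="ignore", invalid="ignore"):
            ratio = np.where(np.abs(s) > 1e-12, y / s, 1.0)
        lam = np.where(ratio > 0, np.clip(ratio, lam_min, lam_max), 1.0)

        x, g = x_new, g_new
        f_hist.append(f(x))

    return x
```

For a positive definite quadratic with diagonal Hessian, each componentwise ratio recovers the corresponding diagonal entry exactly after the first step, which is the mechanism behind the two-step convergence stated above.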