Abstract

Conjugate gradient methods are excellent neural network training methods, characterized by their simplicity and very low memory requirements. In this paper, we propose a new spectral conjugate gradient method that guarantees sufficient descent under any line search, thereby avoiding the usually inefficient restarts. Moreover, we establish the global convergence of the proposed method under mild assumptions. Experimental results provide evidence that the proposed method is preferable and, in general, superior to classical conjugate gradient methods in terms of efficiency and robustness.
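To illustrate the general form of a spectral conjugate gradient iteration, the sketch below shows a generic variant, not the paper's specific update: it combines a Barzilai-Borwein-style spectral scaling with a Polak-Ribière-style conjugacy parameter and a backtracking Armijo line search. All parameter choices here are illustrative assumptions.

```python
import numpy as np

def spectral_cg(f, grad, x0, tol=1e-8, max_iter=500):
    """Generic spectral conjugate gradient sketch (illustrative, not the
    paper's exact method).

    Search direction: d_k = -theta_k * g_k + beta_k * d_{k-1}, where
    theta_k is a spectral (Barzilai-Borwein-like) scaling and beta_k a
    Polak-Ribiere-style parameter clipped at zero.
    """
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # Backtracking Armijo line search (assumed parameters c, rho).
        alpha, c, rho = 1.0, 1e-4, 0.5
        fx = f(x)
        while f(x + alpha * d) > fx + c * alpha * g.dot(d):
            alpha *= rho
        x_new = x + alpha * d
        g_new = grad(x_new)
        s, y = x_new - x, g_new - g
        # Spectral (Barzilai-Borwein) scaling, guarded against s.y <= 0.
        theta = s.dot(s) / s.dot(y) if s.dot(y) > 1e-12 else 1.0
        # Polak-Ribiere-style beta, clipped at zero (an automatic restart).
        beta = max(g_new.dot(y) / g.dot(g), 0.0)
        d = -theta * g_new + beta * d
        # Fall back to steepest descent if d is not a descent direction.
        if g_new.dot(d) >= 0:
            d = -g_new
        x, g = x_new, g_new
    return x
```

On a simple convex quadratic such as f(x) = (x_0 - 3)^2 + (x_1 + 2)^2, this iteration converges to the minimizer (3, -2) in a few steps.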

  • Publication date: February 2012