Abstract

This article describes a new Riemannian conjugate gradient method and presents a global convergence analysis. The existing Fletcher-Reeves-type Riemannian conjugate gradient method is guaranteed to be globally convergent when implemented with the strong Wolfe conditions. In contrast, the Dai-Yuan-type Euclidean conjugate gradient method generates globally convergent sequences under the weak Wolfe conditions. This article generalizes the Dai-Yuan Euclidean algorithm to a Riemannian algorithm that requires only the weak Wolfe conditions. The global convergence of the proposed method is proved by means of the scaled vector transport associated with the differentiated retraction. The results of numerical experiments demonstrate the effectiveness of the proposed algorithm.
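
For orientation, the following is a hedged sketch, in generic notation not taken from the article, of the Euclidean Dai-Yuan parameter and the kind of Riemannian analogue the abstract alludes to. Here $R$ denotes a retraction, $\mathcal{T}$ a (possibly scaled) vector transport, $\eta_k$ the search direction, $\alpha_k$ the step size, and $g_k = \operatorname{grad} f(x_k)$; this notation is assumed for illustration only.

Euclidean Dai-Yuan update:
\[
\beta_{k+1}^{\mathrm{DY}} = \frac{\|g_{k+1}\|^2}{\langle d_k,\, g_{k+1} - g_k \rangle}, \qquad d_{k+1} = -g_{k+1} + \beta_{k+1}^{\mathrm{DY}}\, d_k .
\]

A Riemannian analogue via retraction and vector transport:
\[
x_{k+1} = R_{x_k}(\alpha_k \eta_k), \qquad
\beta_{k+1} = \frac{\langle g_{k+1},\, g_{k+1} \rangle_{x_{k+1}}}{\langle g_{k+1},\, \mathcal{T}_{\alpha_k \eta_k}(\eta_k) \rangle_{x_{k+1}} - \langle g_k,\, \eta_k \rangle_{x_k}}, \qquad
\eta_{k+1} = -g_{k+1} + \beta_{k+1}\, \mathcal{T}_{\alpha_k \eta_k}(\eta_k).
\]

Weak Wolfe conditions on the step size $\alpha_k$ (with $0 < c_1 < c_2 < 1$):
\[
f(R_{x_k}(\alpha_k \eta_k)) \le f(x_k) + c_1 \alpha_k \langle g_k, \eta_k \rangle_{x_k}, \qquad
\left.\frac{d}{dt} f(R_{x_k}(t \eta_k))\right|_{t=\alpha_k} \ge \, c_2 \left.\frac{d}{dt} f(R_{x_k}(t \eta_k))\right|_{t=0}.
\]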

  • Publication date: 2016-05