Abstract

In this paper, we propose and analyze an incremental subgradient method with a diminishing stepsize rule for a convex optimization problem on a Riemannian manifold, where the objective function is the sum of a large number of component functions. Problems of this type arise in many applications. Under the assumption that the sectional curvature of the manifold is nonnegative, we establish a key inequality for the sequence generated by the method. Using this inequality, we prove Proposition 3.1 and then obtain several convergence results for the incremental subgradient method.
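
For concreteness, a minimal sketch of the problem class and of a typical incremental subgradient step is given below; the notation (the manifold $M$, components $f_i$, subgradients $g_i$, exponential map $\exp$, and stepsizes $\alpha_k$) is assumed here for illustration and need not match the notation used in the body of the paper.

% Problem class: minimize a finite sum of convex component functions on M
% (assumed notation; not fixed by the abstract itself).
\begin{equation*}
  \min_{x \in M} \; f(x) = \sum_{i=1}^{m} f_i(x),
  \qquad f_i \colon M \to \mathbb{R} \ \text{convex on the Riemannian manifold } M.
\end{equation*}
% One outer iteration processes the components sequentially, moving along
% geodesics via the exponential map with a diminishing stepsize $\alpha_k$:
\begin{equation*}
  \psi_0 = x_k, \qquad
  \psi_i = \exp_{\psi_{i-1}}\!\bigl(-\alpha_k\, g_i\bigr), \quad
  g_i \in \partial f_i(\psi_{i-1}), \quad i = 1,\dots,m, \qquad
  x_{k+1} = \psi_m.
\end{equation*}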