Abstract

In many machine learning problems, such as the dual form of the SVM, the objective function to be minimized is convex but not strongly convex. This makes it difficult to establish the complexity of some commonly used optimization algorithms. In this paper, we prove global linear convergence for a wide range of algorithms when they are applied to certain non-strongly convex problems. In particular, we are the first to prove an O(log(1/ε)) time complexity for cyclic coordinate descent methods on the dual problems of support vector classification and regression.
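For context, below is a minimal sketch of the kind of cyclic coordinate descent the abstract refers to, applied to the L1-loss linear SVM dual, min_α 0.5·αᵀQα − eᵀα subject to 0 ≤ α_i ≤ C with Q_ij = y_i y_j x_iᵀx_j. This is not the authors' code; the function name, the fixed epoch count, and the use of NumPy are illustrative assumptions.

```python
import numpy as np

def dual_cd_svm(X, y, C=1.0, epochs=50):
    """Cyclic coordinate descent on the L1-loss SVM dual (a sketch).

    Minimizes 0.5 * a^T Q a - e^T a subject to 0 <= a_i <= C,
    where Q_ij = y_i y_j x_i^T x_j, maintaining w = sum_i a_i y_i x_i
    so each coordinate update costs O(d).
    """
    n, d = X.shape
    alpha = np.zeros(n)
    w = np.zeros(d)
    Qii = (X ** 2).sum(axis=1)  # diagonal of Q (y_i^2 = 1)
    for _ in range(epochs):     # a convergence test could replace the fixed count
        for i in range(n):      # cyclic order over coordinates
            if Qii[i] == 0.0:
                continue
            G = y[i] * w.dot(X[i]) - 1.0  # partial gradient w.r.t. alpha[i]
            # exact one-dimensional minimizer, clipped to the box [0, C]
            a_new = min(max(alpha[i] - G / Qii[i], 0.0), C)
            w += (a_new - alpha[i]) * y[i] * X[i]  # incremental update of w
            alpha[i] = a_new
    return w, alpha
```

The objective here is convex but not strongly convex whenever Q is singular (e.g., when n exceeds the feature dimension), which is exactly the setting where the linear-rate analysis claimed in the abstract is non-trivial.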

  • Publication date: 2014-4