Abstract

Structured output prediction is an important machine learning problem in both theory and practice, and the max-margin Markov network (M^3N) is an effective approach. All state-of-the-art algorithms for optimizing M^3N objectives require at least O(1/epsilon) iterations to find an epsilon-accurate solution. Nesterov [1] broke this barrier by proposing an excessive gap reduction (EGR) technique which converges in O(1/sqrt(epsilon)) iterations. However, it is restricted to Euclidean projections, which consequently require an intractable amount of computation per iteration when applied to M^3N. In this paper, we show that by extending EGR to Bregman projections, this faster rate of convergence is retained and, more importantly, the updates can be performed efficiently by exploiting the graphical model factorization. Further, we design a kernelized procedure which allows all computations per iteration to be performed at the same cost as in the state-of-the-art approaches.
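The gap between the two rates can be made concrete with a small sketch. The following is purely illustrative and is not the paper's M^3N solver or its EGR/Bregman machinery: it compares plain gradient descent, whose worst-case iteration count scales as O(1/epsilon), against Nesterov-style acceleration, which scales as O(1/sqrt(epsilon)), on an assumed toy quadratic objective; the helper names `gd`, `agd`, and the curvature values are my own choices.

```python
# Illustrative comparison on f(x) = 0.5 * sum(a_i * x_i^2), minimized at 0.
# Plain gradient descent vs. a Nesterov-type accelerated scheme; both use
# step size 1/L where L = max(a_i) is the gradient Lipschitz constant.
import math

A = [1.0, 100.0]   # assumed toy curvatures (condition number 100)
L = max(A)
EPS = 1e-6

def f(x):
    return 0.5 * sum(a * v * v for a, v in zip(A, x))

def grad(x):
    return [a * v for a, v in zip(A, x)]

def gd(x0, eps=EPS, max_iter=100000):
    # Plain gradient descent: x <- x - (1/L) * grad f(x).
    x = list(x0)
    for k in range(1, max_iter + 1):
        g = grad(x)
        x = [v - gi / L for v, gi in zip(x, g)]
        if f(x) <= eps:
            return k
    return max_iter

def agd(x0, eps=EPS, max_iter=100000):
    # Nesterov acceleration: gradient step at an extrapolated point y,
    # with the standard momentum sequence t_{k+1} = (1 + sqrt(1+4t_k^2))/2.
    x, y, t = list(x0), list(x0), 1.0
    for k in range(1, max_iter + 1):
        g = grad(y)
        x_new = [v - gi / L for v, gi in zip(y, g)]
        t_new = (1.0 + math.sqrt(1.0 + 4.0 * t * t)) / 2.0
        y = [xn + (t - 1.0) / t_new * (xn - xo) for xn, xo in zip(x_new, x)]
        x, t = x_new, t_new
        if f(x) <= eps:
            return k
    return max_iter

gd_iters = gd([1.0, 1.0])
agd_iters = agd([1.0, 1.0])
print(gd_iters, agd_iters)  # acceleration needs markedly fewer iterations
```

On this ill-conditioned quadratic the accelerated scheme reaches the same epsilon tolerance in far fewer iterations than plain gradient descent, mirroring the O(1/sqrt(epsilon)) versus O(1/epsilon) separation the abstract describes; the paper's contribution is making such accelerated updates tractable per iteration for M^3N via Bregman projections.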

  • Publication date: 2014-1-30