Abstract

Based on the notion of duality in convex optimization, a novel online learning algorithmic framework with a forgetting property is proposed. The Fenchel conjugate of the hinge loss is the key to transferring the underlying learning problem from the batch setting to the online setting. New online learning algorithms are derived via two dual ascending procedures: (1) gradient ascent and (2) greedy ascent. Earlier research is reviewed, and detailed experiments on synthetic and real-world datasets verify the effectiveness of the approaches. An important conclusion is that the derived online learning algorithms can handle settings where the target hypothesis is not fixed but drifts over the sequence of examples, which paves the way for the design and analysis of such online learning algorithms.
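The abstract's dual-ascent idea can be illustrated with a minimal sketch. In the primal-dual view of online learning, the weight vector is induced by dual variables attached to the Fenchel conjugate of the hinge loss, and each round performs one ascent step on the newest dual variable. The class name, step size, and clipping rule below are illustrative assumptions, not the paper's exact procedure:

```python
import numpy as np

class DualAscentOnlineLearner:
    """Hypothetical sketch of online learning driven by dual ascent.

    The primal weights are maintained as w = (1/lam) * sum_i alpha_i * y_i * x_i,
    where each alpha_i in [0, 1] is a dual variable for the Fenchel
    conjugate of the hinge loss on example i.  On each round a single
    clipped gradient-ascent step is taken on the newest dual variable,
    which realizes the "gradient ascent" procedure in spirit only.
    """

    def __init__(self, dim, lam=1.0, eta=0.5):
        self.w = np.zeros(dim)   # primal weights induced by the dual variables
        self.lam = lam           # regularization strength (assumed)
        self.eta = eta           # dual step size (assumed)

    def predict(self, x):
        """Predict a binary label in {-1, +1} for input x."""
        return 1 if self.w @ x >= 0 else -1

    def update(self, x, y):
        """Receive (x, y), suffer hinge loss, then ascend the dual."""
        margin = y * (self.w @ x)
        loss = max(0.0, 1.0 - margin)
        if loss > 0.0:
            # A positive hinge loss means the dual objective can be
            # increased; the clipped step keeps alpha_t in [0, 1].
            alpha = min(1.0, self.eta * loss)
            self.w += (alpha / self.lam) * y * x
        return loss
```

On a linearly separable stream (e.g. label = sign of the first coordinate) the cumulative mistake count stays small, since each dual step moves the primal weights toward the separating direction; handling a drifting target, as the abstract claims, would additionally require the forgetting mechanism the paper proposes.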

  • Publication date: 2014
