Abstract

We propose a novel and fast algorithm for training support vector machines (SVMs) in the primal space. It solves an approximation of the SVM optimization problem that is unconstrained, continuous, and twice differentiable, using the Newton optimization technique. Further, we devise a special pre-extracting procedure that speeds up the convergence of the algorithm by supplying a high-quality initial solution. Theoretical analysis shows that the proposed algorithm produces an ε-approximate solution to the standard SVM while maintaining low computational complexity. Experimental results on benchmark data sets demonstrate that our algorithm is much faster than dual-based solvers such as SVMlight while achieving comparable test accuracy.
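To make the approach concrete, the following is a minimal sketch (our own illustration under stated assumptions, not the paper's exact formulation) of primal SVM training with a Newton-type step. It assumes the squared hinge loss as the smooth surrogate for the standard hinge loss; the function name primal_newton_svm and all parameter names are hypothetical.

```python
# Minimal sketch: linear SVM trained in the primal with Newton steps,
# assuming the squared hinge loss as the smooth approximation:
#   min_w  (lam/2) ||w||^2 + sum_i max(0, 1 - y_i x_i.w)^2
# The squared hinge loss is continuously differentiable; its (generalized)
# Hessian restricted to margin-violating points gives a small linear solve
# per Newton step.
import numpy as np

def primal_newton_svm(X, y, lam=1e-2, max_iter=50, tol=1e-6):
    """X: (n, d) data matrix; y: (n,) labels in {-1, +1}. Returns weights w."""
    n, d = X.shape
    w = np.zeros(d)
    for _ in range(max_iter):
        margins = y * (X @ w)
        sv = margins < 1                       # points violating the margin
        # Gradient: lam*w - 2 * sum over sv of y_i (1 - m_i) x_i
        grad = lam * w - 2 * X[sv].T @ (y[sv] * (1 - margins[sv]))
        # Generalized Hessian: lam*I + 2 * X_sv^T X_sv
        H = lam * np.eye(d) + 2 * X[sv].T @ X[sv]
        step = np.linalg.solve(H, grad)
        w = w - step
        if np.linalg.norm(step) < tol:         # stop when the Newton step is tiny
            break
    return w

# Toy usage: linearly separable two-class data.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(-2, 1, (50, 2)), rng.normal(2, 1, (50, 2))])
y = np.hstack([-np.ones(50), np.ones(50)])
w = primal_newton_svm(X, y)
print("training accuracy:", np.mean(np.sign(X @ w) == y))
```

Because the active set of margin violators typically shrinks as the iterates improve, each Newton solve involves only a d-by-d system, which is one reason primal Newton methods can outperform dual decomposition solvers on problems where d is moderate.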