Abstract

In this paper, a novel root-finding problem for the Lagrangian support vector regression in 2-norm (LSVR) is formulated, in which the number of unknowns equals the number of training examples. We further propose solving it by functional iterative and Newton methods and, under sufficient conditions, prove their linear rate of convergence. Experiments are performed on a number of synthetic and real-world benchmark datasets, and the results are compared with support vector regression (SVR) and its variants, such as least squares SVR and LSVR. Similar generalization performance with improved or comparable learning speed relative to SVR and its variants demonstrates the usefulness of the proposed formulation solved by the iterative methods.
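The abstract contrasts two classical iterative schemes for root finding. As a hedged illustration only (not the paper's LSVR-specific formulation, whose details are not given here), the sketch below applies a functional (fixed-point) iteration and Newton's method to a toy system `F(u) = u - cos(u) = 0`; the function names and the toy problem are assumptions for demonstration.

```python
import numpy as np

def newton_root(F, J, u0, tol=1e-10, max_iter=100):
    """Newton's method for a system F(u) = 0 with Jacobian J.

    Each step solves J(u) s = F(u) and updates u <- u - s.
    """
    u = u0.astype(float)
    for _ in range(max_iter):
        step = np.linalg.solve(J(u), F(u))
        u = u - step
        if np.linalg.norm(step) < tol:
            break
    return u

def fixed_point(G, u0, tol=1e-10, max_iter=1000):
    """Functional (fixed-point) iteration u_{k+1} = G(u_k).

    Converges linearly when G is a contraction near the root.
    """
    u = u0.astype(float)
    for _ in range(max_iter):
        u_new = G(u)
        if np.linalg.norm(u_new - u) < tol:
            return u_new
        u = u_new
    return u

# Toy example (hypothetical, for illustration): F(u) = u - cos(u),
# whose unique root is approximately 0.7390851332.
F = lambda u: u - np.cos(u)
J = lambda u: np.diag(1.0 + np.sin(u))  # Jacobian of F
G = lambda u: np.cos(u)                 # equivalent fixed-point map

u0 = np.zeros(1)
r_newton = newton_root(F, J, u0)
r_fixed = fixed_point(G, u0)
```

Newton's method converges quadratically near the root, while the fixed-point iteration converges linearly; the paper's analysis establishes linear convergence for its proposed iterations under sufficient conditions.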

  • Publication date: 2016-12