Abstract

The support vector machine (SVM) has become one of the most popular methods in machine learning in recent years. Parameter selection in SVM is an important step toward achieving a high-performance learning machine. Several methods select parameters by minimizing an estimate of the generalization error, such as the leave-one-out (LOO) bound or the empirical error. These methods must solve many quadratic programming problems and compute an inversion of the Gram matrix, which makes them time-consuming on large-scale problems. This paper introduces a fast incremental method to optimize the kernel parameters in SVM by combining a geometric SVM algorithm with an approximation of the gradient of the empirical error. The method updates the kernel parameters and the working set online during incremental learning, which reduces both the CPU time and the storage space required. Numerical tests on several benchmarks confirm the effectiveness of our method.
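
As a rough illustration of the underlying idea (a sketch, not the authors' incremental algorithm), the following Python snippet tunes the RBF kernel width gamma by gradient descent on a finite-difference approximation of the empirical (validation) error. The data, function names, and hyperparameters are all illustrative assumptions.

```python
# Hypothetical sketch: gradient-based kernel parameter selection for an SVM.
# Not the paper's method; it only illustrates descending an approximate
# gradient of the empirical error with respect to a kernel parameter.
import numpy as np
from sklearn.svm import SVC
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split

def empirical_error(gamma, X_tr, y_tr, X_val, y_val):
    """Misclassification rate on a held-out set for an RBF-SVM with this gamma."""
    clf = SVC(kernel="rbf", gamma=gamma).fit(X_tr, y_tr)
    return np.mean(clf.predict(X_val) != y_val)

def tune_gamma(X_tr, y_tr, X_val, y_val, gamma=1.0, lr=0.5, eps=1e-2, steps=20):
    """Gradient descent on log(gamma) using a central-difference gradient.

    The 0/1 error is piecewise constant, so a real implementation (as the
    abstract suggests) would use a smoothed approximation of the error and
    its gradient; the log parameterization simply keeps gamma positive.
    """
    log_g = np.log(gamma)
    for _ in range(steps):
        g_plus, g_minus = np.exp(log_g + eps), np.exp(log_g - eps)
        grad = (empirical_error(g_plus, X_tr, y_tr, X_val, y_val)
                - empirical_error(g_minus, X_tr, y_tr, X_val, y_val)) / (2 * eps)
        log_g -= lr * grad
    return np.exp(log_g)

X, y = make_classification(n_samples=400, n_features=10, random_state=0)
X_tr, X_val, y_tr, y_val = train_test_split(X, y, test_size=0.5, random_state=0)
best_gamma = tune_gamma(X_tr, y_tr, X_val, y_val)
print(f"tuned gamma: {best_gamma:.4f}, "
      f"validation error: {empirical_error(best_gamma, X_tr, y_tr, X_val, y_val):.3f}")
```

Note that this sketch retrains the SVM from scratch at every gradient evaluation; the point of the paper's incremental approach is precisely to avoid that cost by reusing the previous solution when the kernel parameters change.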

Full text