Abstract

To improve the training speed of online least squares support vector machines (LS-SVM) on large-scale problems, a novel training algorithm based on sequential minimal optimization (SMO) is proposed in this paper. First, we present an SMO-based incremental and decremental learning algorithm, which efficiently obtains new solutions from previous training results when new samples are added or less important samples are removed. Then, building on this incremental and decremental learning algorithm, we propose the online LS-SVM. The online LS-SVM not only achieves high training speed and classification accuracy but also adaptively yields sparse solutions for the target classification problems. Finally, several numerical experiments demonstrate the effectiveness of the proposed algorithm.
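The abstract describes the approach only at a high level. As a rough illustration of the general idea (not the authors' exact algorithm), the sketch below shows how an SMO-style solver for the LS-SVM dual can be warm-started when samples arrive or are discarded: a new sample enters with a zero dual variable and the previous solution is reused, while a removed sample's dual variable is shifted onto another point before re-optimizing. All names (`OnlineLSSVM`, `add_sample`, `remove_sample`), the RBF kernel, the maximal-violating-pair selection rule, and the stopping tolerance are assumptions made for this example.

```python
# Illustrative sketch only: an SMO-style solver for the LS-SVM dual with
# warm-started incremental/decremental updates. Pair selection, stopping
# rule, and all names are assumptions, not the paper's exact algorithm.
import numpy as np


def rbf(X, Z, sigma=1.0):
    """RBF kernel matrix between the rows of X and the rows of Z."""
    d2 = ((X[:, None, :] - Z[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2.0 * sigma ** 2))


class OnlineLSSVM:
    def __init__(self, gamma=10.0, sigma=1.0, tol=1e-4, max_iter=10000):
        self.gamma, self.sigma, self.tol, self.max_iter = gamma, sigma, tol, max_iter
        self.X = np.empty((0, 0))
        self.y = np.empty(0)
        self.alpha = np.empty(0)
        self.b = 0.0

    def _ktilde(self):
        # Effective kernel of the LS-SVM dual: K + I / gamma.
        K = rbf(self.X, self.X, self.sigma)
        return K + np.eye(len(self.y)) / self.gamma

    def _smo(self):
        """Run SMO pair updates starting from the current alpha (warm start)."""
        if len(self.y) == 0:
            self.b = 0.0
            return
        Kt = self._ktilde()
        u = Kt @ (self.alpha * self.y)          # u_i = sum_j alpha_j y_j Ktilde_ij
        F = u - self.y                          # at the optimum all F_i are equal (= -b)
        for _ in range(self.max_iter):
            i, j = int(np.argmax(F)), int(np.argmin(F))
            if F[i] - F[j] < self.tol:          # maximal violating pair nearly equal
                break
            eta = Kt[i, i] + Kt[j, j] - 2.0 * Kt[i, j]
            s = (F[j] - F[i]) / eta             # Newton step along the equality constraint
            self.alpha[i] += self.y[i] * s
            self.alpha[j] -= self.y[j] * s
            # Update F cheaply: only alpha_i and alpha_j changed.
            F += s * (Kt[:, i] - Kt[:, j])
        self.b = -0.5 * (F.max() + F.min())

    def add_sample(self, x, label):
        """Incremental step: append a sample with alpha = 0 and re-optimize."""
        x = np.atleast_2d(x)
        self.X = x if self.X.size == 0 else np.vstack([self.X, x])
        self.y = np.append(self.y, label)
        self.alpha = np.append(self.alpha, 0.0)  # keeps sum(alpha * y) = 0
        self._smo()

    def remove_sample(self, k):
        """Decremental step: shift alpha_k onto another point, drop it, re-optimize."""
        m = int(np.argmax(np.abs(self.alpha) * (np.arange(len(self.y)) != k)))
        # Preserve sum(alpha * y) = 0 while zeroing alpha_k.
        self.alpha[m] += self.y[k] * self.y[m] * self.alpha[k]
        self.X = np.delete(self.X, k, axis=0)
        self.y = np.delete(self.y, k)
        self.alpha = np.delete(self.alpha, k)
        self._smo()

    def decision(self, Xq):
        """Decision values f(x) = sum_j alpha_j y_j K(x, x_j) + b."""
        K = rbf(np.atleast_2d(Xq), self.X, self.sigma)
        return K @ (self.alpha * self.y) + self.b
```

In such a sketch, sparsity would be obtained by calling `remove_sample` on points whose dual variables have small magnitude, which mirrors the abstract's idea of discarding less important samples while reusing the previous solution as a warm start.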