Abstract

In this paper, we propose a new classifier, termed the improved nu-twin bounded support vector machine (I nu-TBSVM), which is motivated by the nu-twin support vector machine (nu-TSVM). Similar to the nu-TSVM, the I nu-TBSVM determines two nonparallel hyperplanes such that each is closer to its own class and lies at least a distance of rho_+ or rho_- from the other class. The significant advantage of the I nu-TBSVM over the nu-TSVM is that it avoids the expensive matrix inversion when solving the dual problems. Therefore, the proposed classifier is more efficient on large-scale problems while retaining comparable generalization ability. The I nu-TBSVM also implements the structural risk minimization principle by introducing a regularization term into its objective function. More importantly, the kernel trick can be applied directly to the I nu-TBSVM in the nonlinear case, so the nonlinear I nu-TBSVM is theoretically superior to the nonlinear nu-TSVM. In addition, we prove that the nu-SVM is a special case of the I nu-TBSVM. The properties of the parameters of the I nu-TBSVM are discussed and verified through two experiments on artificial datasets. Numerical experiments on twenty-two benchmark datasets are performed to investigate the validity of the proposed algorithm in both the linear and nonlinear cases. The experimental results demonstrate the effectiveness of the proposed algorithm.
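For readers unfamiliar with twin-type classifiers, the sketch below illustrates only the generic decision rule shared by TSVM-style methods, in which a test point is assigned to the class whose nonparallel hyperplane is nearer. The hyperplane parameters (w_pos, b_pos, w_neg, b_neg) and the helper twin_hyperplane_predict are hypothetical placeholders assumed to come from some training step; this is not the I nu-TBSVM optimization itself.

    # Illustrative sketch of the twin-hyperplane decision rule (not the
    # I nu-TBSVM training procedure). Hyperplane parameters are assumed
    # to be given by a separate, unspecified training step.
    import numpy as np

    def twin_hyperplane_predict(X, w_pos, b_pos, w_neg, b_neg):
        """Label each row of X by the nearer of the two hyperplanes.

        A point x gets label +1 if |w_pos . x + b_pos| / ||w_pos|| is no
        larger than |w_neg . x + b_neg| / ||w_neg||, and -1 otherwise.
        """
        d_pos = np.abs(X @ w_pos + b_pos) / np.linalg.norm(w_pos)
        d_neg = np.abs(X @ w_neg + b_neg) / np.linalg.norm(w_neg)
        return np.where(d_pos <= d_neg, 1, -1)

    if __name__ == "__main__":
        # Toy usage with hand-picked (hypothetical) hyperplanes.
        X = np.array([[2.0, 0.1], [-1.5, -0.2], [3.0, 0.4]])
        w_pos, b_pos = np.array([1.0, 0.0]), -2.0   # roughly fits the +1 points
        w_neg, b_neg = np.array([1.0, 0.0]), 1.5    # roughly fits the -1 points
        print(twin_hyperplane_predict(X, w_pos, b_pos, w_neg, b_neg))  # [ 1 -1  1]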