Abstract

In the rough margin-based v-twin support vector machine (rough v-TSVM), different penalties are assigned to misclassified samples according to their positions when constructing the separating hyperplane, which greatly improves the testing accuracy. However, solving its dual problem involves an expensive matrix inverse operation. In this paper, we propose an improved rough margin-based v-twin bounded support vector machine (I rough v-TBSVM) motivated by the rough v-TSVM. Like its predecessor, the proposed I rough v-TBSVM assigns different penalties according to the samples' positions. In addition, it implements the structural risk minimization principle by introducing a regularization term, so the I rough v-TBSVM yields higher testing accuracy than the rough v-TSVM. It is worth noting that the proposed I rough v-TBSVM avoids the matrix inverse operation entirely, which reduces the computational complexity and saves running time. Moreover, the kernel trick can be applied directly to the I rough v-TBSVM in the nonlinear case, which is essential for obtaining better classification performance; the method is therefore more flexible and has better generalization performance. Numerical experiments on thirty-five benchmark datasets are performed to investigate the validity of the proposed algorithm. Experimental results indicate that it achieves better performance than the compared algorithms.