A feature selection method for nonparallel plane support vector machine classification

Authors: Ye, Qiaolin*; Zhao, Chunxia; Ye, Ning; Zheng, Hao; Chen, Xiaobo
Source: Optimization Methods and Software, 2012, 27(3): 431-443.
DOI: 10.1080/10556788.2010.526608

Abstract

Over the past decades, algorithms based on 1-norm techniques have been widely used to suppress input features. Quite unlike the traditional 1-norm support vector machine (SVM), direct 1-norm optimization of the primal problems of nonparallel plane classifiers such as the generalized proximal support vector machine, the twin support vector machine (TWSVM), and the least squares twin support vector machine (LSTSVM) cannot generate the very sparse solutions that are vital for classification, since such solutions make classifiers easier to store and faster to compute. To address this issue, we develop in this paper a feature selection method for LSTSVM, called a feature selection method for nonparallel plane support vector machine classification (FLSTSVM), which is specially designed for strong feature suppression. We incorporate a Tikhonov regularization term into the objective of LSTSVM and then minimize its 1-norm measure. The solution of FLSTSVM follows directly from solving two smaller quadratic programming problems (QPPs) arising from the two primal QPPs, as opposed to the two dual ones in TWSVM. FLSTSVM is capable of generating very sparse solutions, which means that it can reduce input features in the linear case. When a nonlinear classifier is used, only a few kernel functions determine the classifier. Beyond strong feature suppression, our method retains an edge in computing time over TWSVM, the Newton method for linear programming SVM (NLPSVM), and LPNewton. Lastly, the algorithm is compared on public data sets, as well as on an Exclusive Or (XOR) example.
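The feature-suppression effect of 1-norm minimization that the abstract relies on can be illustrated with a toy example. The sketch below is not the paper's FLSTSVM formulation (which solves two regularized QPPs); it only shows the underlying mechanism: casting an L1-penalized plane fit `X w ≈ y` as a linear program via the standard variable split `w = u - v`, which drives the weights of irrelevant features exactly to zero. All names and data here are illustrative assumptions.

```python
import numpy as np
from scipy.optimize import linprog

# Toy sketch of 1-norm feature suppression (not the paper's FLSTSVM):
# minimize  lam * ||w||_1 + ||X w - y||_1
# as an LP with variables u, v >= 0 (w = u - v) and residual bounds
# r >= |X w - y|.
rng = np.random.default_rng(0)
m, n = 20, 4                      # 20 samples, 4 input features
X = rng.normal(size=(m, n))
y = X[:, 0].copy()                # only feature 0 is relevant
lam = 0.01                        # weight of the 1-norm penalty

I = np.eye(m)
# Objective vector over [u, v, r]: lam * sum(u + v) + sum(r)
c = np.concatenate([lam * np.ones(2 * n), np.ones(m)])
# Constraints:  X(u - v) - y <= r   and   -(X(u - v) - y) <= r
A_ub = np.block([[X, -X, -I],
                 [-X, X, -I]])
b_ub = np.concatenate([y, -y])
res = linprog(c, A_ub=A_ub, b_ub=b_ub,
              bounds=[(0, None)] * (2 * n + m), method="highs")
w = res.x[:n] - res.x[n:2 * n]    # recover the signed weight vector
print(np.round(w, 3))             # feature 0 is kept; the rest are zeroed
```

Because the penalty and loss are both polyhedral, the optimum sits at a vertex of the feasible set, which is why 1-norm solutions are sparse; this is the same property FLSTSVM exploits to discard input features (linear case) or kernel functions (nonlinear case).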

Full text