Abstract

This paper introduces a novel sparse nonparametric support vector machine classifier (SN-SVM) which combines data distribution information from two state-of-the-art kernel-based classifiers, namely, the kernel support vector machine (KSVM) and the kernel nonparametric discriminant (KND). The proposed model incorporates some near-global variations of the data provided by the KND and, hence, may be viewed as an extension to the KSVM. Similarly, since the support vectors improve the choice of k-nearest neighbors (k's), it can also serve as an extension to the KND. The proposed model is capable of dealing with both heteroscedastic and non-normal data while avoiding the small sample size problem. The model is a convex quadratic optimization problem with a single global optimal solution, so it can be solved easily and efficiently by numerical methods. It can also be reduced to the classical KSVM model, so existing SVM programs can be used for easy implementation. Through a Bayesian interpretation with the help of a Gaussian prior, we show that our method provides a sparse solution by assigning non-zero weights to only a fraction of the total number of training samples. This sparsity can be exploited by existing sparse classification algorithms to obtain better computational efficiency. Experimental results on real-world datasets and face recognition applications show that the proposed SN-SVM model improves classification accuracy over contemporary classifiers and also provides a sparser solution than the KSVM.
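The sparsity property described above is the same one exhibited by the classical kernel SVM dual: only the support vectors receive non-zero weights. As a minimal sketch of that baseline behavior (this does not implement SN-SVM itself; the dataset and parameters are illustrative assumptions), one can inspect the fraction of training samples retained as support vectors by an off-the-shelf kernel SVM:

```python
# Illustrative sketch only: shows the sparsity of a standard kernel SVM
# dual solution, the property the abstract compares SN-SVM against.
# The synthetic dataset and hyperparameters are arbitrary assumptions.
from sklearn.datasets import make_classification
from sklearn.svm import SVC

X, y = make_classification(n_samples=200, n_features=10, random_state=0)
clf = SVC(kernel="rbf", C=1.0).fit(X, y)

# Only the support vectors carry non-zero dual weights.
n_sv = int(clf.n_support_.sum())
print(f"support vectors: {n_sv} / {len(X)} ({n_sv / len(X):.0%} of training set)")
```

A sparser classifier keeps this fraction smaller, which reduces both storage and per-prediction kernel evaluations at test time.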

  • Publication date: 2014-11