Abstract

In this paper, we propose a novel nonparallel classifier, termed the projection nonparallel support vector machine (PNPSVM), which differs fundamentally from existing nonparallel classifiers. The new classifier obtains the optimal proximal hyperplanes in two steps. The first step finds two projection directions, which serve as the normal vectors of the optimal proximal hyperplanes and achieve maximum class separability by simultaneously minimizing the within-class distance and maximizing the between-class distance; the second step determines the specific locations of the optimal proximal hyperplanes based on an appropriate central sample. Furthermore, an improved successive overrelaxation (SOR) algorithm is applied to solve PNPSVM. The main contributions of this paper can be summarized as follows: (1) the structural risk minimization principle is implemented in the primal problems; (2) the potential structural information of the data is exploited by considering both the tightness among similar patterns and the discrepancy between dissimilar pairs; (3) the kernel trick can be applied directly, since the dual problems have the same elegant formulation as that of the standard support vector machine; (4) the SOR technique is introduced to solve the optimization problems efficiently. Comprehensive experimental results on an artificial dataset and twenty-four UCI datasets demonstrate the effectiveness of our method in terms of classification accuracy.
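Since the abstract notes that the dual problems share the elegant form of the standard SVM dual and are solved by SOR, the following minimal sketch illustrates how SOR sweeps handle a box-constrained SVM-style dual. The exact PNPSVM dual is not reproduced here; the function name, relaxation factor `omega`, and sweep count are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

def sor_box_qp(K, y, C=1.0, omega=1.3, n_sweeps=200):
    """SOR sweeps for a box-constrained SVM-style dual:
        min_a 0.5 a^T Q a - e^T a,  subject to 0 <= a <= C,
    where Q = diag(y) K diag(y). Each coordinate is updated
    Gauss-Seidel style with relaxation factor omega in (0, 2),
    then clipped back onto the box [0, C]."""
    n = len(y)
    Q = K * np.outer(y, y)          # Q_ij = y_i y_j K(x_i, x_j)
    a = np.zeros(n)
    for _ in range(n_sweeps):
        for i in range(n):
            g = Q[i] @ a - 1.0      # i-th component of the gradient
            a[i] = np.clip(a[i] - omega * g / Q[i, i], 0.0, C)
    return a

# Toy usage: four linearly separable points with a linear kernel.
X = np.array([[2., 2.], [3., 3.], [-2., -2.], [-3., -3.]])
y = np.array([1., 1., -1., -1.])
a = sor_box_qp(X @ X.T, y, C=10.0)
w = (a * y) @ X                     # primal normal vector (bias omitted
preds = np.sign(X @ w)              # here since the data are symmetric)
```

The coordinate-wise update plus clipping is what makes SOR attractive for duals of this form: each sweep touches one multiplier at a time and never needs to solve a subproblem, so memory stays linear in the number of samples.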