Abstract

The kernel is a key component of support vector machines (SVMs) and other kernel methods. Based on the distributions of the classes in feature space, we propose a kernel selection criterion named kernel distance-based class separability (KDCS) to evaluate the goodness of a kernel in multiclass classification scenarios. KDCS is differentiable with respect to the kernel parameters, so gradient-based optimization techniques can be used to find the best model efficiently. In addition, it does not require setting aside part of the training samples for validation, and thus makes full use of all available training data. The relationship between this criterion and kernel polarization is also explored. Compared with the 10-fold cross-validation technique, which is often regarded as a benchmark, this criterion is found to yield about the same performance as exhaustive parameter search.
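To make the idea concrete, the sketch below illustrates a kernel distance-based separability score of the general kind described above: it measures mean between-class minus mean within-class squared distance in the kernel-induced feature space, using the identity ||phi(x_i) - phi(x_j)||^2 = k(x_i, x_i) + k(x_j, x_j) - 2 k(x_i, x_j), and tunes an RBF kernel width by gradient ascent. The exact KDCS definition is not given in this abstract, so the criterion, the `rbf_kernel`/`separability`/`tune_gamma` names, and the use of a numerical gradient (in place of the analytic one the paper's differentiability claim enables) are all illustrative assumptions.

```python
import numpy as np

def rbf_kernel(X, gamma):
    # k(x, z) = exp(-gamma * ||x - z||^2), computed for all pairs
    sq = np.sum(X ** 2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * X @ X.T
    return np.exp(-gamma * np.maximum(d2, 0.0))

def separability(X, y, gamma):
    # Mean between-class minus mean within-class squared distance
    # in feature space; a stand-in for the paper's KDCS criterion.
    K = rbf_kernel(X, gamma)
    diag = np.diag(K)
    D = diag[:, None] + diag[None, :] - 2.0 * K  # pairwise feature-space distances
    same = y[:, None] == y[None, :]
    off_diag = ~np.eye(len(y), dtype=bool)
    between = D[~same].mean()
    within = D[same & off_diag].mean()
    return between - within

def tune_gamma(X, y, gamma0=1.0, lr=0.5, steps=100, eps=1e-5):
    # Gradient ascent on log(gamma) so gamma stays positive; a central
    # difference approximates the gradient for simplicity.
    lg = np.log(gamma0)
    for _ in range(steps):
        grad = (separability(X, y, np.exp(lg + eps)) -
                separability(X, y, np.exp(lg - eps))) / (2.0 * eps)
        lg += lr * grad
    return np.exp(lg)
```

Because the score is a smooth function of the kernel parameter, a single gradient-based search replaces the grid of models a cross-validation sweep would have to train, and no samples are held out for validation.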