Abstract

Multiple Kernel Learning (MKL) has been demonstrated to improve classification performance effectively, but it incurs high computational cost in large-scale cases. In this paper, we aim to reduce both the time and space complexities of MKL, and therefore propose an efficient multi-kernel classification machine based on the Nyström approximation. First, we generate different kernel matrices K^(p) for the given data. Second, we apply the Nyström approximation technique to each K^(p) so as to obtain its corresponding approximation matrix K̃^(p). Third, we fuse the multiple generated K̃^(p) into the final ensemble matrix G̃ with a certain heuristic rule. Finally, we select the Kernelized Modification of Ho-Kashyap algorithm with Squared approximation of the misclassification errors (KMHKS) as the incorporated paradigm, and apply G̃ within KMHKS. In doing so, we propose a multi-kernel classification machine with reduced complexity, named Nyström approximation matrix with Multiple KMHKSs (NMKMHKS). The experimental results reported here validate both the effectiveness and the efficiency of the proposed NMKMHKS. The contributions of NMKMHKS are as follows: (1) compared with existing MKL methods, NMKMHKS reduces the computational complexity of finding the solution from O(Mn^3) to O(Mnm^2), where M is the number of kernels, n is the number of training samples, and m is the number of columns selected from K^(p); meanwhile, NMKMHKS reduces the space complexity of storing the kernel matrices from O(Mn^2) to O(n^2); (2) compared with the original KMHKS, NMKMHKS improves the classification performance while keeping a comparable space complexity; (3) NMKMHKS achieves better recognition when the multiple adopted K^(p) are strongly correlated; and (4) NMKMHKS has a tighter generalization risk bound in terms of the Rademacher complexity analysis.
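The core approximation step described above can be sketched as follows. This is a minimal illustration in NumPy, assuming uniform column sampling for the Nyström step and simple averaging as the heuristic fusion rule; the paper's actual sampling scheme and fusion rule may differ, and the RBF kernel widths below are illustrative choices, not the paper's settings.

```python
import numpy as np

def nystrom_approx(K, m, rng):
    """Nystrom approximation of an n x n kernel matrix K from m sampled columns.

    Returns K_tilde = C @ pinv(W) @ C.T, where C holds the m sampled
    columns of K and W is the m x m block at the sampled rows/columns.
    """
    n = K.shape[0]
    idx = rng.choice(n, size=m, replace=False)  # uniform sampling (assumed)
    C = K[:, idx]               # n x m sampled columns
    W = K[np.ix_(idx, idx)]     # m x m intersection block
    return C @ np.linalg.pinv(W) @ C.T

# Toy data and M = 2 RBF kernels with different (illustrative) widths.
rng = np.random.default_rng(0)
X = rng.standard_normal((100, 5))
sq_dists = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
kernels = [np.exp(-sq_dists / (2 * s ** 2)) for s in (1.0, 2.0)]

# Approximate each K^(p) with m << n columns, then fuse the K_tilde^(p)
# into the ensemble matrix G_tilde by averaging (assumed fusion rule).
approx = [nystrom_approx(K, m=20, rng=rng) for K in kernels]
G_tilde = sum(approx) / len(approx)
print(G_tilde.shape)  # (100, 100); G_tilde is then fed to KMHKS
```

Storing only C and W per kernel is what drives the complexity reductions claimed above: each approximation costs O(nm^2) instead of the O(n^3) needed to work with the full matrix.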