Abstract

The Hilbert-Schmidt independence criterion (HSIC) is widely used to measure the statistical dependence between two data sets. HSIC first maps the two data sets into two reproducing kernel Hilbert spaces (RKHSs), respectively, and then measures the dependence between them with the Hilbert-Schmidt (HS) operator. This paper proposes a dimension reduction method based on maximizing the HSIC between the high-dimensional data and the dimension-reduced data, denoted HSIC-NDR. In the proposed method, the linear kernel is chosen as the kernel function of the RKHS of the low-dimensional data, because the reduced data then appear explicitly in the kernel matrix, which facilitates the construction of the objective function of the dimension reduction algorithm. The kernel function of the RKHS of the original data set, by contrast, can be chosen freely according to the specific application, so the proposed algorithm is widely applicable. Experiments are conducted on ten synthetic and real data sets commonly used in machine learning, with five representative dimension reduction algorithms of different properties (linear, nonlinear global, nonlinear local, and nonlinear global + local) used for comparison. The results show that HSIC-NDR outperforms these representative algorithms without increasing computational complexity; both HSIC-NDR and the compared algorithms reduce to Rayleigh quotient computations.
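As a minimal sketch of the idea, assuming the objective is the biased HSIC estimate tr(KHK_YH)/(n-1)^2 with a linear kernel K_Y = XWW^TX^T on the projected data Y = XW and an RBF kernel K on the original data: the objective then equals tr(W^TX^THKHXW), which under the orthonormality constraint W^TW = I is a Rayleigh quotient problem solved by the top eigenvectors of X^THKHX. The function names and kernel choice below are illustrative, not the authors' exact implementation.

```python
import numpy as np

def rbf_kernel(X, gamma=None):
    """Gaussian (RBF) kernel matrix on the rows of X (an assumed choice)."""
    sq = np.sum(X**2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * X @ X.T
    if gamma is None:
        gamma = 1.0 / X.shape[1]  # a common default bandwidth, not from the paper
    return np.exp(-gamma * np.maximum(d2, 0.0))

def hsic_ndr(X, n_components=2):
    """Project X to n_components dimensions by maximizing the biased HSIC
    estimate tr(K H K_Y H) with a linear kernel on Y = X W.

    Since K_Y = X W W^T X^T, the objective is tr(W^T X^T H K H X W), and its
    maximizer under W^T W = I consists of the top eigenvectors of the
    symmetric matrix X^T H K H X (a Rayleigh quotient computation)."""
    n = X.shape[0]
    H = np.eye(n) - np.ones((n, n)) / n   # centering matrix
    K = rbf_kernel(X)                     # kernel on the original data
    M = X.T @ H @ K @ H @ X               # symmetric objective matrix
    vals, vecs = np.linalg.eigh(M)        # eigenvalues in ascending order
    W = vecs[:, -n_components:]           # top n_components eigenvectors
    return X @ W, W

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 10))
    Y, W = hsic_ndr(X, n_components=2)
    print(Y.shape)  # (200, 2)
```

Swapping `rbf_kernel` for another positive-definite kernel changes only the matrix K, which is what makes the choice of the original-data kernel application-dependent while the eigenproblem structure stays the same.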