Sparse Principal Component Analysis via Rotation and Truncation

Authors: Hu, Zhenfang; Pan, Gang; Wang, Yueming*; Wu, Zhaohui
Source: IEEE Transactions on Neural Networks and Learning Systems, 2016, 27(4): 875-890.
DOI: 10.1109/TNNLS.2015.2427451

Abstract

Sparse principal component analysis (sparse PCA) aims at finding a sparse basis to improve interpretability over the dense basis of PCA, while still covering the data subspace as much as possible. In contrast to most existing work, which addresses the problem by adding sparsity penalties to various objectives of PCA, we propose a new method, sparse PCA via rotation and truncation (SPCArt), which finds a rotation matrix and a sparse basis such that the sparse basis approximates the rotated PCA basis. The algorithm of SPCArt consists of three alternating steps: 1) rotating the PCA basis; 2) truncating small entries; and 3) updating the rotation matrix. Its performance bounds are also given. SPCArt is efficient, with each iteration scaling linearly with the data dimension. Parameter choice is simple, owing to explicit physical interpretations. We give a unified view of several existing sparse PCA methods and discuss their connections with SPCArt. Some ideas from SPCArt are extended to GPower, a popular sparse PCA algorithm, to address its limitations. Experimental results demonstrate that SPCArt achieves state-of-the-art performance, along with a good tradeoff among various criteria, including sparsity, explained variance, orthogonality, balance of sparsity among loadings, and computational speed.
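The three alternating steps described in the abstract can be sketched as follows. This is a hedged illustration, not the authors' reference implementation: it assumes hard thresholding as the truncation rule (the paper discusses several truncation schemes), and it uses the standard orthogonal Procrustes solution (via SVD) for the rotation update. The function name `spcart_sketch` and the threshold parameter `lam` are illustrative choices.

```python
import numpy as np

def spcart_sketch(V, lam=0.3, n_iter=50):
    """Illustrative sketch of the SPCArt alternation.

    V   : d x r matrix whose columns are the leading PCA loadings
          (assumed orthonormal).
    lam : truncation threshold (hard thresholding assumed here).

    Returns a sparse basis X (d x r) and a rotation matrix R (r x r)
    such that X approximately equals V @ R with small entries zeroed.
    """
    d, r = V.shape
    R = np.eye(r)  # start from the identity rotation
    for _ in range(n_iter):
        # 1) rotate the PCA basis
        Z = V @ R
        # 2) truncate small entries (hard thresholding), then
        #    renormalize the surviving columns to unit length
        X = np.where(np.abs(Z) > lam, Z, 0.0)
        norms = np.linalg.norm(X, axis=0)
        norms[norms == 0] = 1.0  # guard against fully truncated columns
        X = X / norms
        # 3) update the rotation: orthogonal Procrustes problem
        #    min_R ||X - V R||_F  ->  R = U W^T from SVD of V^T X
        U, _, Wt = np.linalg.svd(V.T @ X)
        R = U @ Wt
    return X, R
```

Each iteration costs O(d r) for the rotation and truncation plus O(r^3) for the small SVD, consistent with the abstract's claim that the per-iteration cost scales linearly with the data dimension d.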