Abstract

In recent years, matrix rank minimization problems have received significant attention in the machine learning, data mining and computer vision communities. These problems are commonly solved via a convex relaxation that minimizes the nuclear norm instead of the rank of the matrix; the relaxed problem must be solved iteratively and requires a singular value decomposition (SVD) at each iteration. Consequently, algorithms for nuclear norm minimization suffer from the high computational cost of repeated SVDs. In this paper, we propose a Fast Tri-Factorization (FTF) method to approximate the nuclear norm minimization problem and mitigate the cost of performing SVDs. The proposed FTF method can be used to reliably solve a wide range of low-rank matrix recovery and completion problems, such as robust principal component analysis (RPCA), low-rank representation (LRR) and low-rank matrix completion (MC). We also present three specific models for the RPCA, LRR and MC problems, respectively. Moreover, we develop two alternating direction method (ADM) based iterative algorithms for solving the above three problems. Experimental results on a variety of synthetic and real-world data sets validate the efficiency, robustness and effectiveness of our FTF method compared with state-of-the-art nuclear norm minimization algorithms.
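To make the per-iteration SVD cost concrete, the following is a minimal sketch (not the paper's FTF method) of the singular value thresholding step that standard nuclear norm solvers perform at every iteration; the matrix sizes and threshold value are illustrative assumptions.

```python
import numpy as np

def svt(X, tau):
    """Singular value thresholding: the proximal operator of the nuclear norm.

    Returns the minimizer of 0.5*||Z - X||_F^2 + tau*||Z||_*,
    obtained by soft-thresholding the singular values of X.
    Computing it requires a full SVD of X, which is the dominant
    cost when X is large -- the bottleneck FTF is designed to avoid.
    """
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    s_shrunk = np.maximum(s - tau, 0.0)  # soft-threshold singular values
    return U @ np.diag(s_shrunk) @ Vt

# Illustrative low-rank matrix (rank at most 5)
rng = np.random.default_rng(0)
A = rng.standard_normal((50, 5)) @ rng.standard_normal((5, 40))
Z = svt(A, tau=1.0)
```

In an iterative nuclear norm solver this operator is applied once per iteration, so its SVD cost is multiplied by the iteration count.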