Abstract

Generating low-rank matrix approximations is of great importance in large-scale machine learning applications. The standard Nystrom method is one of the state-of-the-art techniques for generating such approximations, and it has developed rapidly since it was first applied to Gaussian process regression. Several enhanced variants, such as the ensemble Nystrom, modified Nystrom, and SS-Nystrom methods, have since been proposed, along with many sampling schemes. In this paper, we review Nystrom methods for large-scale machine learning. First, we introduce the various Nystrom methods. Second, we review the different sampling methods available for them and summarize these methods from the perspectives of both theoretical analysis and practical performance. We then list several typical machine learning applications that make use of Nystrom methods. Finally, we discuss some open machine learning problems related to Nystrom methods and draw our conclusions.
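To make the central object of this survey concrete, the following is a minimal sketch of the standard Nystrom approximation: given a symmetric positive semidefinite kernel matrix K, sample c columns C and the corresponding c-by-c intersection block W, and approximate K by C W⁺ Cᵀ. The uniform column sampling, the RBF kernel, and all variable names here are illustrative assumptions, not the authors' specific setup.

```python
import numpy as np

def nystrom_approximation(K, c, seed=None):
    """Standard Nystrom approximation of a PSD kernel matrix K.

    Uniformly samples c columns (C) and the corresponding c x c
    intersection block (W), then returns C @ pinv(W) @ C.T.
    Uniform sampling is just one choice; the survey covers many others.
    """
    rng = np.random.default_rng(seed)
    n = K.shape[0]
    idx = rng.choice(n, size=c, replace=False)
    C = K[:, idx]                # n x c block of sampled columns
    W = K[np.ix_(idx, idx)]      # c x c intersection block
    return C @ np.linalg.pinv(W) @ C.T

# Toy usage: approximate an RBF kernel matrix on random points.
rng = np.random.default_rng(0)
X = rng.standard_normal((200, 5))
sq_dists = ((X[:, None, :] - X[None, :, :]) ** 2).sum(axis=-1)
K = np.exp(-0.5 * sq_dists)
K_hat = nystrom_approximation(K, c=50, seed=1)
rel_err = np.linalg.norm(K - K_hat) / np.linalg.norm(K)
```

The approximation costs O(nc² + c³) instead of the O(n³) of a full eigendecomposition, which is the source of the scalability the abstract refers to.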