Abstract

Non-negative Matrix Factorization (NMF) has become one of the most popular models in data mining owing to its good performance in unsupervised learning applications. Recently, a variety of divergence functions have been studied extensively for NMF, but there is still a lack of analysis of the relationships between the divergence functions and the applications. This article gives some preliminary results on this problem. Our experiments show that the two most familiar divergence functions, the least-squares error and the K-L divergence, are competent for unsupervised learning tasks such as gene expression data clustering and image processing.
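To make the two divergence functions mentioned above concrete, the following is a minimal NumPy sketch of NMF via the standard multiplicative update rules, with one variant for the least-squares (Frobenius) objective and one for the generalized K-L divergence. The function name `nmf` and all parameter defaults are illustrative assumptions, not from the original article.

```python
import numpy as np

def nmf(V, r, beta="frobenius", n_iter=200, eps=1e-9, seed=0):
    """Sketch of multiplicative-update NMF: V (m x n, non-negative) ~ W (m x r) @ H (r x n).

    beta="frobenius" minimizes the least-squares error ||V - WH||_F^2;
    beta="kl" minimizes the generalized K-L divergence D(V || WH).
    """
    rng = np.random.default_rng(seed)
    m, n = V.shape
    # Random non-negative initialization (eps keeps entries strictly positive).
    W = rng.random((m, r)) + eps
    H = rng.random((r, n)) + eps
    for _ in range(n_iter):
        if beta == "frobenius":
            # Least-squares updates: H *= (W^T V) / (W^T W H), W *= (V H^T) / (W H H^T).
            H *= (W.T @ V) / (W.T @ W @ H + eps)
            W *= (V @ H.T) / (W @ H @ H.T + eps)
        else:
            # K-L updates: numerators weight V by the current reconstruction WH,
            # denominators are column sums of W and row sums of H.
            WH = W @ H + eps
            H *= (W.T @ (V / WH)) / (W.sum(axis=0)[:, None] + eps)
            WH = W @ H + eps
            W *= ((V / WH) @ H.T) / (H.sum(axis=1)[None, :] + eps)
    return W, H

# Toy usage: factor a random non-negative matrix with both objectives.
V = np.abs(np.random.default_rng(1).random((20, 30)))
W, H = nmf(V, r=5, beta="frobenius")
frob_err = np.linalg.norm(V - W @ H)
```

Both update rules keep `W` and `H` element-wise non-negative by construction, which is what makes the resulting factors interpretable as additive parts in applications such as gene expression clustering and image processing.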

Full text