Dropout non-negative matrix factorization

Authors: He, Zhicheng; Liu, Jie*; Liu, Caihua; Wang, Yuan; Yin, Airu; Huang, Yalou
Source: Knowledge and Information Systems, 2019, 60(2): 781-806.
DOI: 10.1007/s10115-018-1259-x

Abstract

Non-negative matrix factorization (NMF) has received considerable attention in research areas such as document clustering, image analysis, and collaborative filtering. However, NMF-based approaches often suffer from overfitting and interdependent features, which are caused by latent feature co-adaptation during the learning process. Most existing improvements to NMF rely on side information or task-specific knowledge, which are not always available. Dropout has been widely recognized as a powerful strategy for preventing co-adaptation in deep neural network training. Moreover, it requires no prior knowledge and introduces no additional terms or transformations into the original loss function. In this paper, we introduce the dropout strategy into NMF and propose a dropout NMF algorithm. Specifically, we first design a simple dropout strategy that fuses a dropout mask into the NMF framework to prevent feature co-adaptation. A sequential dropout strategy is then proposed to reduce randomness and achieve robustness. Experimental results on multiple datasets confirm that our dropout NMF methods not only improve NMF itself but also further improve existing representative matrix factorization models.
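The core idea of fusing a dropout mask into NMF can be illustrated with a short sketch. The exact masking scheme and update rules of the paper are not reproduced here; this is an illustrative variant that samples a Bernoulli mask over the latent dimensions at each iteration and runs standard Lee-Seung multiplicative updates on the masked factors, so that dropped latent features are frozen for that step and cannot co-adapt. The function name, dropout rate `p`, and iteration count are assumptions for illustration.

```python
import numpy as np

def dropout_nmf(X, k, p=0.5, iters=200, seed=0):
    """Illustrative NMF with a dropout mask over latent dimensions.

    Approximates X (n x d, non-negative) as W @ H with W (n x k) and
    H (k x d). At each iteration a Bernoulli mask zeroes out a random
    subset of the k latent features; the multiplicative updates are
    computed with the masked factors, so dropped features are left
    untouched for that step. This is a sketch of the general strategy,
    not the paper's exact algorithm.
    """
    rng = np.random.default_rng(seed)
    n, d = X.shape
    W = rng.random((n, k)) + 1e-3   # small offset keeps factors strictly positive
    H = rng.random((k, d)) + 1e-3
    eps = 1e-9                       # avoids division by zero in the updates
    for _ in range(iters):
        m = (rng.random(k) > p).astype(float)   # dropout mask over latent dims
        if m.sum() == 0:                        # keep at least one active feature
            m[rng.integers(k)] = 1.0
        Wm = W * m                              # mask columns of W
        # Multiplicative update for H using the masked W; rows of H for
        # dropped dimensions see eps/eps = 1 and are therefore unchanged.
        H *= (Wm.T @ X + eps) / (Wm.T @ Wm @ H + eps)
        Hm = H * m[:, None]                     # mask rows of H
        # Symmetric update for W using the masked H.
        W *= (X @ Hm.T + eps) / (W @ Hm @ Hm.T + eps)
    return W, H
```

Because the multiplicative updates preserve non-negativity, the masked variant keeps both factors non-negative while only a random subset of latent features is refined per iteration, which is the co-adaptation-breaking effect the paper describes.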