Abstract

Nonnegative tensor (matrix) factorization finds an increasing number of applications in disciplines such as machine learning, data mining, and blind source separation. Computationally, the underlying optimization problem is solved by alternately minimizing over one factor while the others are held fixed. To solve each subproblem efficiently, we first introduce a variable regularization term that keeps the subproblem well conditioned. Second, an augmented Lagrangian alternating direction method is employed to solve this convex, well-conditioned regularized subproblem, and two acceleration techniques are also implemented. Preliminary numerical experiments demonstrate the improvements achieved by the new method.
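
The following is a minimal sketch, not the authors' exact algorithm, of the scheme described above for the matrix case X ≈ WH: alternating minimization in which each factor subproblem carries a proximal regularization term (μ/2)‖H − H₀‖²_F to keep it well conditioned, and is solved by a simple augmented Lagrangian alternating direction (ADMM-style) splitting. All parameter names and values (mu, rho, iteration counts) are illustrative assumptions, and the acceleration techniques mentioned in the abstract are omitted.

```python
import numpy as np

def admm_nnls(X, W, H0, mu=1e-3, rho=1.0, iters=50):
    """Solve min_{H>=0} 0.5||X - W H||_F^2 + (mu/2)||H - H0||_F^2 via ADMM splitting."""
    k = W.shape[1]
    # The proximal term mu*I (plus the penalty rho*I) keeps this Gram matrix well conditioned.
    G = W.T @ W + (mu + rho) * np.eye(k)
    WtX = W.T @ X
    Z = H0.copy()                 # auxiliary variable carrying the nonnegativity constraint
    U = np.zeros_like(H0)         # scaled dual variable
    for _ in range(iters):
        # H-update: unconstrained regularized least squares (a single linear solve)
        H = np.linalg.solve(G, WtX + mu * H0 + rho * (Z - U))
        # Z-update: projection onto the nonnegative orthant
        Z = np.maximum(H + U, 0.0)
        # dual update
        U += H - Z
    return Z

def nmf_alternating(X, k, outer_iters=100, seed=0):
    """Alternately update W and H, each via the regularized ADMM subproblem."""
    rng = np.random.default_rng(seed)
    m, n = X.shape
    W = rng.random((m, k))
    H = rng.random((k, n))
    for _ in range(outer_iters):
        H = admm_nnls(X, W, H)            # fix W, update H
        W = admm_nnls(X.T, H.T, W.T).T    # fix H, update W (transposed subproblem)
    return W, H
```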

Full text