Decomposable norm minimization with proximal-gradient homotopy algorithm

Authors: Eghbali, Reza*; Fazel, Maryam
Source: Computational Optimization and Applications, 2017, 66(2): 345-381.
DOI: 10.1007/s10589-016-9871-8

Abstract

We study the convergence rate of the proximal-gradient homotopy algorithm applied to norm-regularized linear least squares problems, for a general class of norms. The homotopy algorithm reduces the regularization parameter in a series of steps, and uses a proximal-gradient algorithm to solve the problem at each step. The proximal-gradient algorithm has a linear rate of convergence provided that the objective function is strongly convex and the gradient of its smooth component is Lipschitz continuous. In many applications the objective function of this type of problem is not strongly convex, especially when the problem is high-dimensional and the regularizer is chosen to induce sparsity or low-dimensionality. We show that if the linear sampling matrix satisfies certain assumptions and the regularizing norm is decomposable, the proximal-gradient homotopy algorithm converges at a linear rate even though the objective function is not strongly convex. Our result generalizes results on the linear convergence of the homotopy algorithm for ℓ1-regularized least squares problems. Numerical experiments are presented that support the theoretical convergence rate analysis.

  • Publication date: March 2017
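
To make the scheme described in the abstract concrete, below is a minimal sketch of a proximal-gradient homotopy loop, specialized to the ℓ1 norm (one example of a decomposable norm) and the least squares objective 0.5‖Ax − b‖². The step size, the geometric decrease factor eta, the stage tolerances, and the names prox_grad, prox_grad_homotopy, and soft_threshold are illustrative assumptions, not the authors' exact algorithm or implementation.

```python
import numpy as np


def soft_threshold(z, tau):
    # Proximal operator of tau * ||.||_1 (soft-thresholding).
    return np.sign(z) * np.maximum(np.abs(z) - tau, 0.0)


def prox_grad(A, b, lam, x0, step, tol, max_iter=1000):
    # Proximal-gradient iterations for 0.5*||Ax - b||^2 + lam*||x||_1,
    # stopped when the relative change in the iterate falls below tol.
    x = x0.copy()
    for _ in range(max_iter):
        grad = A.T @ (A @ x - b)
        x_new = soft_threshold(x - step * grad, step * lam)
        if np.linalg.norm(x_new - x) <= tol * max(1.0, np.linalg.norm(x)):
            return x_new
        x = x_new
    return x


def prox_grad_homotopy(A, b, lam_target, eta=0.7, eps=1e-8):
    # Homotopy: start with a large regularization parameter and shrink it
    # geometrically, warm-starting each stage from the previous solution.
    step = 1.0 / np.linalg.norm(A, 2) ** 2    # 1/L, L = Lipschitz constant of the gradient
    lam = np.max(np.abs(A.T @ b))             # lam_max: for lam >= lam_max, x = 0 is optimal
    x = np.zeros(A.shape[1])
    while lam > lam_target:
        lam = max(eta * lam, lam_target)
        # Solve each stage only to moderate accuracy, proportional to lam.
        x = prox_grad(A, b, lam, x, step, tol=1e-3 * lam)
    # Final stage at the target parameter, solved to high accuracy.
    return prox_grad(A, b, lam_target, x, step, tol=eps)


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    A = rng.standard_normal((200, 500))
    x_true = np.zeros(500)
    x_true[:10] = rng.standard_normal(10)      # sparse ground truth
    b = A @ x_true + 0.01 * rng.standard_normal(200)
    x_hat = prox_grad_homotopy(A, b, lam_target=0.1)
    print("nonzeros recovered:", np.count_nonzero(np.abs(x_hat) > 1e-4))
```

The design choice mirrored here is that each stage is solved only approximately and warm-starts the next, so most of the work happens at loose tolerances; the paper's analysis concerns when this overall loop achieves a linear rate under decomposable regularizers, even without strong convexity of the objective.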