Abstract

In this paper, we consider a class of Forward-Backward (FB) splitting methods that includes several variants (e.g., inertial schemes, FISTA) for minimizing the sum of two proper convex and lower semicontinuous functions, one of which has a Lipschitz continuous gradient and the other of which is partly smooth relative to a smooth active manifold M. We propose a unified framework under which we show that this class of FB-type algorithms (i) correctly identifies the active manifold in a finite number of iterations (finite activity identification), and (ii) then enters a local linear convergence regime, which we characterize precisely in terms of the structure of the underlying active manifold. We also establish and explain why FISTA (with convergent sequences) locally oscillates and can be locally slower than FB. These results have numerous potential applications, including in signal/image processing, sparse recovery and machine learning. Indeed, they explain the typical behaviour observed numerically for many problems in these fields, such as the Lasso, the group Lasso, the fused Lasso and nuclear norm minimization, to name only a few.
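
As a minimal illustrative sketch of the algorithm class discussed above (not the paper's unified framework), the snippet below runs the standard FB and FISTA updates on a Lasso instance, where the proximal step is soft-thresholding and the active manifold corresponds to the support of the solution. The problem data, step-size choice and helper names (`soft_threshold`, `fb_lasso`) are assumptions made here for illustration only; in practice one would expect the tracked support to stabilize after finitely many iterations (finite activity identification), after which convergence is locally linear.

```python
import numpy as np

def soft_threshold(v, t):
    # prox of t*||.||_1 (soft-thresholding)
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def fb_lasso(A, b, lam, n_iter=500, fista=False):
    """Forward-Backward splitting on 0.5*||Ax-b||^2 + lam*||x||_1;
    set fista=True for the inertial (FISTA) variant."""
    L = np.linalg.norm(A, 2) ** 2          # Lipschitz constant of the smooth part's gradient
    gamma = 1.0 / L                        # illustrative step size
    x = np.zeros(A.shape[1])
    y, t = x.copy(), 1.0
    supports = []
    for _ in range(n_iter):
        grad = A.T @ (A @ y - b)           # gradient of f(y) = 0.5*||Ay - b||^2
        x_new = soft_threshold(y - gamma * grad, gamma * lam)
        if fista:
            # FISTA extrapolation step
            t_new = (1 + np.sqrt(1 + 4 * t * t)) / 2
            y = x_new + ((t - 1) / t_new) * (x_new - x)
            t = t_new
        else:
            y = x_new
        x = x_new
        # track the support, which plays the role of the active manifold here
        supports.append(tuple(np.flatnonzero(x)))
    return x, supports

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    A = rng.standard_normal((40, 100))
    x_true = np.zeros(100)
    x_true[:5] = rng.standard_normal(5)
    b = A @ x_true + 0.01 * rng.standard_normal(40)
    x, supports = fb_lasso(A, b, lam=0.1, fista=True)
    print("final support:", supports[-1])
```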

  • Publication date: 2017