Abstract

We propose a first-order smoothed penalty algorithm (SPA) to solve the sparse recovery problem min{‖x‖₁ : Ax = b}. SPA is efficient as long as the matrix-vector products Ax and Aᵀy can be computed efficiently; in particular, A need not have orthogonal rows. SPA converges to the target signal by solving a sequence of penalized optimization subproblems, and each subproblem is solved using Nesterov's optimal algorithm for simple sets [Yu. Nesterov, Introductory Lectures on Convex Optimization: A Basic Course, Kluwer Academic Publishers, Norwell, MA, 2004] and [Yu. Nesterov, Math. Program., 103 (2005), pp. 127-152]. We show that the SPA iterates x_k are ε-feasible, i.e., ‖Ax_k − b‖₂ ≤ ε, and ε-optimal, i.e., |‖x_k‖₁ − ‖x*‖₁| ≤ ε, after Õ(ε^(−3/2)) iterations. SPA is able to work with an ℓ₁, ℓ₂, or ℓ∞ penalty on the infeasibility, and SPA can be easily extended to solve the relaxed recovery problem min{‖x‖₁ : ‖Ax − b‖₂ ≤ δ}.
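The penalty-plus-smoothing scheme the abstract describes can be illustrated with a minimal sketch. This is not the authors' exact SPA: the choices here — a squared ℓ₂ penalty on the infeasibility, Huber smoothing of ‖x‖₁, a fixed schedule of penalty weights, and a generic Nesterov-type accelerated gradient loop for each subproblem — are illustrative assumptions, as are the function names `huber_grad` and `spa_sketch`. Note that the inner loop touches A only through the products Ay and Aᵀ(Ay − b), matching the abstract's efficiency requirement.

```python
import numpy as np

def huber_grad(x, mu):
    # Gradient of the Huber smoothing of ||x||_1 with parameter mu:
    # x/mu where |x_i| < mu, and sign(x_i) elsewhere.
    return np.clip(x / mu, -1.0, 1.0)

def spa_sketch(A, b, penalties=(1.0, 10.0, 100.0), mu=1e-3, inner_iters=500):
    """Illustrative smoothed penalty scheme (not the paper's exact SPA).

    For each penalty weight rho in an increasing schedule, approximately
    minimize the smooth subproblem
        f(x) = huber_mu(x) + (rho/2) * ||Ax - b||_2^2
    with Nesterov's accelerated gradient method, warm-starting each
    subproblem from the previous solution.
    """
    m, n = A.shape
    x = np.zeros(n)
    for rho in penalties:
        # Lipschitz constant of grad f: 1/mu + rho * ||A||_2^2.
        L = 1.0 / mu + rho * np.linalg.norm(A, 2) ** 2
        y, t = x.copy(), 1.0
        for _ in range(inner_iters):
            grad = huber_grad(y, mu) + rho * (A.T @ (A @ y - b))
            x_new = y - grad / L
            t_new = 0.5 * (1.0 + np.sqrt(1.0 + 4.0 * t * t))
            y = x_new + ((t - 1.0) / t_new) * (x_new - x)
            x, t = x_new, t_new
    return x
```

Increasing the penalty weight across subproblems drives ‖Ax_k − b‖₂ toward zero, mirroring the sequence of penalized subproblems the abstract describes; warm-starting each stage keeps the total iteration count manageable.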

  • Publication date: 2011