Abstract

We study preconditioned algorithms of alternating direction method of multipliers type for nonsmooth optimization problems. The alternating direction method of multipliers is a popular first-order method for general constrained optimization problems. However, one of its drawbacks is the need to solve implicit subproblems. In various applications, these subproblems are either easily solvable, or linear but nevertheless challenging to solve. We derive a preconditioned version that allows for flexible and efficient preconditioning of these linear subproblems. Both the original and the preconditioned versions are written as a new kind of proximal point method for the primal problem, and weak (strong) convergence in infinite-dimensional (finite-dimensional) Hilbert spaces is proved. Various efficient preconditioners, with any number of inner iterations, may be used within this preconditioned framework. Furthermore, connections are established between the preconditioned version and the recently introduced preconditioned Douglas-Rachford method for general nonsmooth problems involving quadratic-linear terms. The methods are applied to total variation denoising problems, and their benefits are demonstrated in numerical experiments.
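To fix ideas, the following is a generic sketch of the setting, not necessarily the exact scheme analyzed in this paper: the ADMM iteration for $\min_{x,y} f(x) + g(y)$ subject to $Ax + By = c$, with the $x$-subproblem augmented by a preconditioning term. Here $f$, $g$, $A$, $B$, $c$, the penalty parameter $\rho > 0$, the scaled multiplier $\lambda$, and the symmetric positive semidefinite preconditioner $M$ are assumed notation, not taken from the text:
\[
\begin{aligned}
x^{k+1} &\in \operatorname*{arg\,min}_{x} \; f(x) + \tfrac{\rho}{2}\,\bigl\|Ax + By^{k} - c + \lambda^{k}\bigr\|^{2} + \tfrac{1}{2}\,\bigl\|x - x^{k}\bigr\|_{M}^{2},\\
y^{k+1} &\in \operatorname*{arg\,min}_{y} \; g(y) + \tfrac{\rho}{2}\,\bigl\|Ax^{k+1} + By - c + \lambda^{k}\bigr\|^{2},\\
\lambda^{k+1} &= \lambda^{k} + Ax^{k+1} + By^{k+1} - c.
\end{aligned}
\]
For $M = 0$ the classical implicit subproblem is recovered; choosing $M$ so that the quadratic part of the $x$-update becomes easy to invert is the general pattern behind such preconditioned variants, and corresponds to replacing the exact solution of the linear subproblem by one or several inner iterations of a standard preconditioned linear solver.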