A Fast Symmetric Alternating Direction Method of Multipliers

Author(s)


Abstract

In recent years, the alternating direction method of multipliers (ADMM) and its variants have gained popularity through their extensive use in image processing and statistical learning. Symmetric ADMM, a variant of ADMM that updates the Lagrange multiplier twice in each iteration, is always faster whenever it converges. In this paper, we combine Nesterov's acceleration strategy with symmetric ADMM to propose an accelerated symmetric ADMM, and we prove its $\mathcal{O}(\frac{1}{k^2})$ convergence rate under a strong convexity condition. For the general setting, we propose an accelerated method with a restart rule. Preliminary numerical experiments demonstrate the efficiency of our algorithms.
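To illustrate the ingredients named in the abstract, the following is a minimal sketch (not the authors' exact scheme) of symmetric ADMM with a Nesterov-type extrapolation and a simple residual-based restart, applied to the toy problem $\min_{x,z} \frac{1}{2}(x-a)^2 + \frac{1}{2}(z-b)^2$ subject to $x = z$, whose solution is $x = z = \frac{a+b}{2}$. The step sizes `r`, `s`, the penalty `beta`, and the restart test are generic illustrative choices, not taken from the paper.

```python
def symmetric_admm(a, b, beta=1.0, r=0.9, s=0.9, iters=200):
    """Symmetric ADMM with Nesterov-style extrapolation and restart (sketch).

    Solves min 0.5*(x-a)^2 + 0.5*(z-b)^2  s.t.  x = z.
    Both subproblems have closed-form minimizers of the augmented Lagrangian
    L(x, z, lam) = f(x) + g(z) + lam*(x - z) + (beta/2)*(x - z)^2.
    """
    x = z = lam = 0.0
    z_hat, lam_hat = z, lam      # extrapolated (accelerated) variables
    t = 1.0                      # Nesterov momentum parameter
    prev_res = float("inf")
    for _ in range(iters):
        # x-subproblem (closed form)
        x = (a - lam_hat + beta * z_hat) / (1.0 + beta)
        # first (intermediate) multiplier update -- the "symmetric" step
        lam_mid = lam_hat + r * beta * (x - z_hat)
        # z-subproblem (closed form)
        z_new = (b + lam_mid + beta * x) / (1.0 + beta)
        # second multiplier update
        lam_new = lam_mid + s * beta * (x - z_new)

        res = abs(x - z_new)     # primal residual
        if res > prev_res:       # restart rule: drop momentum if residual grows
            t = 1.0
            z_hat, lam_hat = z_new, lam_new
        else:                    # Nesterov extrapolation
            t_next = 0.5 * (1.0 + (1.0 + 4.0 * t * t) ** 0.5)
            gamma = (t - 1.0) / t_next
            z_hat = z_new + gamma * (z_new - z)
            lam_hat = lam_new + gamma * (lam_new - lam)
            t = t_next
        z, lam, prev_res = z_new, lam_new, res
    return x, z

x, z = symmetric_admm(a=4.0, b=2.0)
```

Without the extrapolation step (i.e., `z_hat = z`, `lam_hat = lam`), the loop reduces to plain symmetric ADMM; the restart rule guards against the momentum overshooting on difficult iterations.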

About this article


DOI

10.4208/nmtma.OA-2018-0108