Volume 3, Issue 3
Convergence on a Symmetric Accelerated Stochastic ADMM with Larger Stepsizes

Jianchao Bai, Deren Han, Hao Sun & Hongchao Zhang

CSIAM Trans. Appl. Math., 3 (2022), pp. 448-479.

Published online: 2022-08

  • Abstract

In this paper, we develop a symmetric accelerated stochastic Alternating Direction Method of Multipliers (SAS-ADMM) for solving separable convex optimization problems with linear constraints. The objective function is the sum of a possibly nonsmooth convex function and an average of many smooth convex functions. Our proposed algorithm combines the ideas of ADMM with the techniques of accelerated stochastic gradient methods, possibly with variance reduction, to solve the smooth subproblem. One main feature of SAS-ADMM is that its dual variable is symmetrically updated after each update of the separated primal variables, which allows a more flexible and larger convergence region of the dual variable than that of standard deterministic or stochastic ADMM. This new stochastic optimization algorithm is shown to have ergodic convergence in expectation at an $\mathcal{O}(1/T)$ rate, where $T$ denotes the number of outer iterations. Our preliminary experiments indicate that the proposed algorithm is very effective for solving separable optimization problems arising from big-data applications. Finally, 3-block extensions of the algorithm and its variant, an accelerated stochastic augmented Lagrangian method, are discussed in the appendix.
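
To fix ideas, a schematic sketch of the problem setting and of one symmetric stochastic ADMM iteration is given below. The notation ($\beta$ for the penalty parameter, $r$ and $s$ for the two dual stepsizes) and the exact form of the updates are illustrative assumptions made here for orientation, not the paper's precise scheme or stepsize conditions. The model problem is

$$\min_{x,y}\ f(x) + \frac{1}{n}\sum_{i=1}^{n} g_i(y) \quad \text{s.t.}\quad Ax + By = b,$$

where $f$ is a possibly nonsmooth convex function and each $g_i$ is smooth and convex. A symmetric ADMM updates the dual variable $\lambda$ twice per iteration, once after each primal block:

$$\begin{aligned}
x^{k+1} &= \operatorname*{arg\,min}_{x}\ f(x) + \frac{\beta}{2}\big\|Ax + By^{k} - b - \lambda^{k}/\beta\big\|^{2},\\
\lambda^{k+1/2} &= \lambda^{k} - r\beta\,\big(Ax^{k+1} + By^{k} - b\big),\\
y^{k+1} &\approx \operatorname*{arg\,min}_{y}\ \frac{1}{n}\sum_{i=1}^{n} g_i(y) + \frac{\beta}{2}\big\|Ax^{k+1} + By - b - \lambda^{k+1/2}/\beta\big\|^{2},\\
\lambda^{k+1} &= \lambda^{k+1/2} - s\beta\,\big(Ax^{k+1} + By^{k+1} - b\big),
\end{aligned}$$

where the $y$-subproblem is solved inexactly by an accelerated stochastic gradient step (possibly with variance reduction) using sampled components $g_i$, and the $\mathcal{O}(1/T)$ rate is stated in expectation for the ergodic average of the first $T$ iterates.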

  • AMS Subject Headings

65K10, 65Y20, 68W40, 90C25

  • Copyright

COPYRIGHT: © Global Science Press

  • BibTex

@Article{CSIAM-AM-3-448,
  author  = {Bai, Jianchao and Han, Deren and Sun, Hao and Zhang, Hongchao},
  title   = {Convergence on a Symmetric Accelerated Stochastic ADMM with Larger Stepsizes},
  journal = {CSIAM Transactions on Applied Mathematics},
  year    = {2022},
  volume  = {3},
  number  = {3},
  pages   = {448--479},
  issn    = {2708-0579},
  doi     = {https://doi.org/10.4208/csiam-am.SO-2021-0021},
  url     = {http://global-sci.org/intro/article_detail/csiam-am/20969.html}
}

  • RIS

TY  - JOUR
T1  - Convergence on a Symmetric Accelerated Stochastic ADMM with Larger Stepsizes
AU  - Bai, Jianchao
AU  - Han, Deren
AU  - Sun, Hao
AU  - Zhang, Hongchao
JO  - CSIAM Transactions on Applied Mathematics
VL  - 3
IS  - 3
SP  - 448
EP  - 479
PY  - 2022
DA  - 2022/08
SN  - 2708-0579
DO  - 10.4208/csiam-am.SO-2021-0021
UR  - https://global-sci.org/intro/article_detail/csiam-am/20969.html
KW  - Convex optimization, stochastic ADMM, symmetric ADMM, larger stepsize, proximal mapping, complexity

  • TXT

Jianchao Bai, Deren Han, Hao Sun & Hongchao Zhang. (2022). Convergence on a Symmetric Accelerated Stochastic ADMM with Larger Stepsizes. CSIAM Transactions on Applied Mathematics. 3 (3). 448-479. doi:10.4208/csiam-am.SO-2021-0021