Convergence on a Symmetric Accelerated Stochastic ADMM with Larger Stepsizes

Year:    2022

Author:    Jianchao Bai, Deren Han, Hao Sun, Hongchao Zhang

CSIAM Transactions on Applied Mathematics, Vol. 3 (2022), Iss. 3 : pp. 448–479

Abstract

In this paper, we develop a symmetric accelerated stochastic Alternating Direction Method of Multipliers (SAS-ADMM) for solving separable convex optimization problems with linear constraints. The objective function is the sum of a possibly nonsmooth convex function and an average of many smooth convex functions. The proposed algorithm combines the ideas of ADMM with techniques from accelerated stochastic gradient methods, possibly with variance reduction, to solve the smooth subproblem. A main feature of SAS-ADMM is that its dual variable is updated symmetrically after each update of the separated primal variables, which allows a more flexible and larger convergence region for the dual variable than in standard deterministic or stochastic ADMM. The new stochastic optimization algorithm is shown to have ergodic convergence in expectation at an $\mathcal{O}(1/T)$ rate, where $T$ denotes the number of outer iterations. Preliminary experiments indicate that the proposed algorithm is very effective for solving separable optimization problems arising in big-data applications. Finally, 3-block extensions of the algorithm and its variant, an accelerated stochastic augmented Lagrangian method, are discussed in the appendix.
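For concreteness, the display below sketches the problem class and one outer iteration with symmetric dual updates, in the style of a Peaceman–Rachford-type scheme consistent with the abstract. The notation here ($\theta_1$, $\theta_{2,i}$, penalty $\beta$, dual stepsize factors $s$ and $t$, augmented Lagrangian $\mathcal{L}_\beta$) is assumed for illustration and may differ from the paper's exact formulation.

% Assumed notation for illustration; the paper's symbols may differ.
% Problem class: a possibly nonsmooth convex block x plus a finite-sum
% smooth convex block y, coupled by a linear constraint.
\[
  \min_{x \in \mathcal{X},\; y \in \mathcal{Y}}\;
    \theta_1(x) + \frac{1}{n}\sum_{i=1}^{n}\theta_{2,i}(y)
  \quad \text{s.t.} \quad Ax + By = b,
\]
% Augmented Lagrangian with penalty parameter \beta > 0:
\[
  \mathcal{L}_\beta(x,y,\lambda)
    = \theta_1(x) + \frac{1}{n}\sum_{i=1}^{n}\theta_{2,i}(y)
      - \lambda^{\top}(Ax+By-b) + \frac{\beta}{2}\,\|Ax+By-b\|^2.
\]
% One outer iteration: the dual variable \lambda is updated after EACH
% primal block (the "symmetric" feature), with stepsize factors s, t.
\begin{align*}
  x^{k+1} &\in \operatorname*{arg\,min}_{x \in \mathcal{X}}\;
      \mathcal{L}_\beta(x, y^k, \lambda^k),\\
  \lambda^{k+1/2} &= \lambda^k - s\beta\,(Ax^{k+1} + By^k - b),\\
  % y-subproblem solved inexactly by accelerated stochastic gradient
  % steps, possibly with variance reduction:
  y^{k+1} &\approx \operatorname*{arg\,min}_{y \in \mathcal{Y}}\;
      \mathcal{L}_\beta(x^{k+1}, y, \lambda^{k+1/2}),\\
  \lambda^{k+1} &= \lambda^{k+1/2} - t\beta\,(Ax^{k+1} + By^{k+1} - b).
\end{align*}

Taking $s=0$ and $t=1$ recovers the single dual update of standard ADMM; allowing both factors to be positive over a larger admissible region is what the abstract's more flexible and larger convergence region refers to.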

Journal Article Details

Publisher Name:    Global Science Press

Language:    English

DOI:    https://doi.org/10.4208/csiam-am.SO-2021-0021

Published online:    2022-01

Copyright:    © Global Science Press

Pages:    32

Keywords:    Convex optimization, stochastic ADMM, symmetric ADMM, larger stepsize, proximal mapping, complexity.

Author Details

Jianchao Bai

Deren Han

Hao Sun

Hongchao Zhang

Citations of This Article

  1. An efficient regularized PR splitting type algorithm for two-block nonconvex linear constrained programs in ℓ1/2 regularized compressed sensing problems

    Chao, Miantao | Lu, Yongzi | Jian, Jinbao | Xu, Xiao

    Journal of Computational and Applied Mathematics, Vol. 453 (2025), p. 116145

    https://doi.org/10.1016/j.cam.2024.116145 [Citations: 0]
  2. A proximal fully parallel splitting method with a relaxation factor for separable convex programming

    Yin, Jianghua | Jian, Jinbao | Jiang, Xianzhen | Wu, Jiansheng | Ma, Guodong

    Applied Numerical Mathematics, Vol. 195 (2024), p. 17

    https://doi.org/10.1016/j.apnum.2023.09.003 [Citations: 1]
  3. Inexact generalized ADMM with relative error criteria for linearly constrained convex optimization problems

    Wu, Zhongming | Song, Ye | Jiang, Fan

    Optimization Letters, Vol. 18 (2024), Iss. 2, p. 447

    https://doi.org/10.1007/s11590-023-01997-8 [Citations: 0]
  4. Reweighted Alternating Direction Method of Multipliers for DNN weight pruning

    Yuan, Ming | Du, Lin | Jiang, Feng | Bai, Jianchao | Chen, Guanrong

    Neural Networks, Vol. 179 (2024), p. 106534

    https://doi.org/10.1016/j.neunet.2024.106534 [Citations: 0]
  5. An accelerated conjugate gradient method for the Z-eigenvalues of symmetric tensors

    Cao, Mingyuan | Yang, Yueting | Li, Chaoqian | Jiang, Xiaowei

    AIMS Mathematics, Vol. 8 (2023), Iss. 7, p. 15008

    https://doi.org/10.3934/math.2023766 [Citations: 0]
  6. A systematic DNN weight pruning framework based on symmetric accelerated stochastic ADMM

    Yuan, Ming | Bai, Jianchao | Jiang, Feng | Du, Lin

    Neurocomputing, Vol. 575 (2024), p. 127327

    https://doi.org/10.1016/j.neucom.2024.127327 [Citations: 1]
  7. An inexact ADMM with proximal-indefinite term and larger stepsize

    Ma, Yuxue | Bai, Jianchao | Sun, Hao

    Applied Numerical Mathematics, Vol. 184 (2023), p. 542

    https://doi.org/10.1016/j.apnum.2022.10.015 [Citations: 4]
  8. Accelerated Stochastic Peaceman–Rachford Method for Empirical Risk Minimization

    Bai, Jian-Chao | Bian, Feng-Miao | Chang, Xiao-Kai | Du, Lin

    Journal of the Operations Research Society of China, Vol. 11 (2023), Iss. 4, p. 783

    https://doi.org/10.1007/s40305-023-00470-8 [Citations: 1]
  9. A Stochastic Nesterov’s Smoothing Accelerated Method for General Nonsmooth Constrained Stochastic Composite Convex Optimization

    Wang, Ruyu | Zhang, Chao | Wang, Lichun | Shao, Yuanhai

    Journal of Scientific Computing, Vol. 93 (2022), Iss. 2

    https://doi.org/10.1007/s10915-022-02016-1 [Citations: 2]
  10. A mini-batch algorithm for large-scale learning problems with adaptive step size

    He, Chongyang | Zhang, Yiting | Zhu, Dingyu | Cao, Mingyuan | Yang, Yueting

    Digital Signal Processing, Vol. 143 (2023), p. 104230

    https://doi.org/10.1016/j.dsp.2023.104230 [Citations: 2]
  11. Multi-step inertial strictly contractive PRSM algorithms for convex programming problems with applications

    Deng, Zhao | Han, Deren

    Journal of Computational and Applied Mathematics, Vol. 437 (2024), p. 115469

    https://doi.org/10.1016/j.cam.2023.115469 [Citations: 1]
  12. A Gradient-Based Algorithm with Nonmonotone Line Search for Nonnegative Matrix Factorization

    Li, Wenbo | Shi, Xiaolu

    Symmetry, Vol. 16 (2024), Iss. 2, p. 154

    https://doi.org/10.3390/sym16020154 [Citations: 0]