Year: 2022
Author: Jianchao Bai, Deren Han, Hao Sun, Hongchao Zhang
CSIAM Transactions on Applied Mathematics, Vol. 3 (2022), Iss. 3 : pp. 448–479
Abstract
In this paper, we develop a symmetric accelerated stochastic Alternating Direction Method of Multipliers (SAS-ADMM) for solving separable convex optimization problems with linear constraints. The objective function is the sum of a possibly nonsmooth convex function and an average of many smooth convex functions. Our proposed algorithm combines the ideas of ADMM with techniques of accelerated stochastic gradient methods, possibly with variance reduction, to solve the smooth subproblem. One main feature of SAS-ADMM is that its dual variable is symmetrically updated after each update of the separated primal variables, which allows a more flexible and larger convergence region for the dual variable compared with that of standard deterministic or stochastic ADMM. This new stochastic optimization algorithm is shown to have ergodic convergence in expectation with an $\mathcal{O}(1/T)$ convergence rate, where $T$ denotes the number of outer iterations. Our preliminary experiments indicate the proposed algorithm is very effective for solving separable optimization problems from big-data applications. Finally, 3-block extensions of the algorithm and its variant, an accelerated stochastic augmented Lagrangian method, are discussed in the appendix.
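To illustrate the symmetric dual-update structure described in the abstract, the following is a minimal sketch, not the authors' actual SAS-ADMM. It applies a symmetric stochastic ADMM to a toy lasso-type problem $\min_x \frac{1}{2n}\|Dx-b\|^2 + \mu\|z\|_1$ s.t. $x - z = 0$: the smooth average term is handled by a mini-batch stochastic gradient step, the nonsmooth term by its proximal mapping (soft-thresholding), and the dual variable is updated after each primal block. All parameter names (`rho`, `s`, `eta`, `batch`) are illustrative assumptions, not the paper's notation.

```python
import numpy as np

def soft_threshold(v, t):
    """Proximal mapping of t*||.||_1 (elementwise soft-thresholding)."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def symmetric_stochastic_admm(D, b, mu=0.1, rho=1.0, s=0.9,
                              T=500, batch=8, eta=0.05, seed=0):
    """Hypothetical sketch of a symmetric stochastic ADMM for
        min_x (1/2n)||D x - b||^2 + mu*||z||_1   s.t.  x - z = 0.
    The dual variable `lam` is updated after EACH primal block
    (the "symmetric" feature), with relaxation factor s in (0, 1]."""
    rng = np.random.default_rng(seed)
    n, d = D.shape
    x, z, lam = np.zeros(d), np.zeros(d), np.zeros(d)
    for _ in range(T):
        # Mini-batch stochastic gradient of the smooth average term at x.
        idx = rng.integers(0, n, size=batch)
        g = D[idx].T @ (D[idx] @ x - b[idx]) / batch
        # Linearized x-update: gradient step on the augmented Lagrangian.
        x = x - eta * (g + lam + rho * (x - z))
        # First (intermediate) dual update.
        lam = lam + s * rho * (x - z)
        # z-update: exact proximal mapping of mu*||.||_1.
        z = soft_threshold(x + lam / rho, mu / rho)
        # Second dual update, completing the symmetric scheme.
        lam = lam + s * rho * (x - z)
    return x, z
```

In a deterministic symmetric ADMM the two dual steps would use possibly different stepsizes chosen from an enlarged convergence region; here both use the single factor `s` for simplicity.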
Journal Article Details
Publisher Name: Global Science Press
Language: English
DOI: https://doi.org/10.4208/csiam-am.SO-2021-0021
Published online: 2022-01
Copyright: © Global Science Press
Pages: 32
Keywords: Convex optimization, stochastic ADMM, symmetric ADMM, larger stepsize, proximal mapping, complexity.