Linearly Convergent First-Order Algorithms for Semidefinite Programming

Author(s)


Abstract

In this paper, we consider two formulations, one smooth and one nonsmooth, for solving linear matrix inequalities (LMIs), an important class of semidefinite programming (SDP) problems, under a certain Slater constraint qualification assumption. We then propose two first-order methods, one based on the subgradient method and the other on Nesterov's optimal method, and show that they converge linearly when applied to these formulations. Moreover, we introduce an accelerated prox-level method that converges linearly for both smooth and nonsmooth problems without requiring any problem parameters as input. Finally, we consider a special case of LMIs, namely linear systems of inequalities, and show that a linearly convergent algorithm can be obtained under a much weaker assumption.
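To illustrate the special case mentioned at the end of the abstract, the sketch below applies a subgradient method with Polyak step size to a linear system of inequalities Ax ≤ b. When the system is strictly feasible (a Slater-type condition), the target optimal value of the max-violation function is below zero, and this scheme is known to converge linearly. This is a minimal illustration of the general idea, not the paper's actual algorithm; all names and the stopping tolerance are our own choices.

```python
import numpy as np

def subgradient_feasibility(A, b, x0, max_iter=1000, tol=1e-10):
    # Minimize f(x) = max_i (a_i^T x - b_i).  If Ax <= b is strictly
    # feasible, then f* < 0, so driving f(x) to 0 yields a feasible point.
    x = np.asarray(x0, dtype=float).copy()
    for _ in range(max_iter):
        r = A @ x - b              # residuals of all inequalities
        i = np.argmax(r)           # index of the most violated constraint
        f = r[i]                   # current max violation
        if f <= tol:               # all inequalities (approximately) hold
            break
        g = A[i]                   # a subgradient of f at x
        x = x - (f / (g @ g)) * g  # Polyak step with target value 0
    return x
```

Each step is an orthogonal projection onto the most violated half-space, so the iterates approach the feasible set monotonically in distance.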

About this article


DOI

10.4208/jcm.1612-m2016-0703

How to Cite

Linearly Convergent First-Order Algorithms for Semidefinite Programming. (2019). Journal of Computational Mathematics, 35(4), 452-468. https://doi.org/10.4208/jcm.1612-m2016-0703