Volume 2, Issue 4
Subspace Methods for Nonlinear Optimization

Xin Liu, Zaiwen Wen & Ya-Xiang Yuan

CSIAM Trans. Appl. Math., 2 (2021), pp. 585-651.

Published online: 2021-11

  • Abstract

Subspace techniques such as Krylov subspace methods are well known and extensively used in numerical linear algebra. They have also become ubiquitous and indispensable tools in nonlinear optimization because of their ability to handle large-scale problems. There are generally two guiding principles: i) the decision variable is updated in a lower-dimensional subspace; ii) the objective function or constraints are approximated in a smaller subspace of their domain. The key ingredients are the construction of suitable subspaces and subproblems, tailored to the specific structures of the variables and functions, so that either exact or inexact solutions of the subproblems are readily available and the corresponding computational cost is significantly reduced. Relevant techniques include, but are not limited to, direct combinations, block coordinate descent, active sets, limited memory, Anderson acceleration, subspace correction, sampling, and sketching. This paper gives a comprehensive survey of subspace methods and their recipes in unconstrained and constrained optimization, nonlinear least squares problems, sparse and low-rank optimization, linear and nonlinear eigenvalue computation, semidefinite programming, stochastic optimization, and more. To provide helpful guidelines, we emphasize high-level concepts for the development and implementation of practical algorithms within the subspace framework.
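The first principle above can be illustrated with a minimal sketch (not taken from the paper): at each iteration the update is restricted to a low-dimensional subspace, here span{gradient, previous step}, and the reduced subproblem over that subspace is solved exactly for a convex quadratic f(x) = ½xᵀAx − bᵀx. The function name and iteration limit are illustrative choices, not the authors' algorithm.

```python
import numpy as np

def subspace_quadratic_min(A, b, x0, iters=50):
    """Minimize 0.5 x^T A x - b^T x by exact minimization over a
    two-dimensional subspace spanned by the current gradient and the
    previous step (a simple instance of a subspace method)."""
    x = x0.copy()
    prev_step = None
    for _ in range(iters):
        g = A @ x - b                      # gradient of f at x
        if np.linalg.norm(g) < 1e-10:
            break
        # Basis of the search subspace: gradient plus previous step.
        dirs = [g] if prev_step is None else [g, prev_step]
        S = np.column_stack(dirs)          # n x k matrix, k <= 2
        # Reduced subproblem: min_a 0.5 a^T (S^T A S) a + a^T (S^T g),
        # i.e. solve the k x k system (S^T A S) a = -S^T g.
        H = S.T @ A @ S
        a = np.linalg.lstsq(H, -S.T @ g, rcond=None)[0]
        step = S @ a
        x = x + step
        prev_step = step
    return x

# Usage: a small symmetric positive definite system; the subspace
# iteration drives the residual ||Ax - b|| toward zero.
rng = np.random.default_rng(0)
M = rng.standard_normal((20, 20))
A = M @ M.T + 20 * np.eye(20)
b = rng.standard_normal(20)
x = subspace_quadratic_min(A, b, np.zeros(20))
print(np.linalg.norm(A @ x - b))
```

On a quadratic this gradient-plus-previous-step subspace reproduces conjugate-gradient-like behavior while only ever solving a 2×2 reduced problem, which is the cost-reduction idea the abstract describes.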

  • AMS Subject Headings

65K05, 90C30

  • Copyright

COPYRIGHT: © Global Science Press

  • Keywords

Nonlinear optimization, subspace techniques, block coordinate descent, active sets, limited memory, Anderson acceleration, subspace correction, subsampling, sketching

Xin Liu, Zaiwen Wen & Ya-Xiang Yuan. (2021). Subspace Methods for Nonlinear Optimization. CSIAM Transactions on Applied Mathematics. 2 (4). 585-651. doi:10.4208/csiam-am.SO-2021-0016