Discrete Approximation Scheme in Distributionally Robust Optimization

Year:    2021

Author:    Yongchao Liu, Xiaoming Yuan, Jin Zhang

Numerical Mathematics: Theory, Methods and Applications, Vol. 14 (2021), Iss. 2 : pp. 285–320

Abstract

Discrete approximation, which has been the prevailing scheme in stochastic programming over the past decade, has recently been extended to distributionally robust optimization (DRO). In this paper, we conduct a rigorous quantitative stability analysis of discrete approximation schemes for DRO, which measures the approximation error in terms of the discretization sample size. For the ambiguity set defined through equality and inequality moment conditions, we quantify the discrepancy between the discretized ambiguity sets and the original set with respect to the Wasserstein metric. To establish the quantitative convergence, we develop a Hoffman error bound theory with Hoffman constant calculation criteria in an infinite-dimensional space, which can be regarded as a byproduct of independent interest. For the ambiguity set defined by a Wasserstein ball, and for moment conditions combined with a Wasserstein ball, we present a similar quantitative stability analysis by taking full advantage of the convexity inherent in the Wasserstein metric. Efficient numerical methods for solving discrete approximation DRO problems with thousands of samples are also designed. In particular, we reformulate different types of discrete approximation problems into a class of saddle point problems with completely separable structures. The stochastic primal-dual hybrid gradient (PDHG) algorithm, in which each iteration updates a random subset of the sampled variables, is then well suited to solving the reformulated saddle point problems. Some preliminary numerical tests are reported.
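
The abstract describes reformulating the discretized DRO problems as saddle-point problems with completely separable structure and applying a stochastic PDHG method that updates a random subset of the sampled variables per iteration. As a rough illustration of that idea only (not the authors' algorithm or code), the following Python sketch runs one randomized-dual-block PDHG variant on a generic block-separable saddle-point problem; K_blocks, prox_g, prox_fconj, the step sizes tau and sigma, and the batch size are all hypothetical placeholders.

```python
# Illustrative sketch of a stochastic PDHG iteration for
#     min_x max_{y_1,...,y_N}  sum_i <K_i x, y_i> - f_i*(y_i) + g(x),
# where only a random subset of the dual blocks y_i is updated per iteration.
import numpy as np

def stochastic_pdhg(K_blocks, prox_g, prox_fconj, x0, y0,
                    tau=0.01, sigma=0.01, batch_size=10, n_iters=1000,
                    seed=None):
    """K_blocks: list of (m_i, d) arrays; x0: (d,) array; y0: list of (m_i,) arrays."""
    rng = np.random.default_rng(seed)
    N = len(K_blocks)
    x = x0.copy()
    y = [yi.copy() for yi in y0]
    # Aggregate K^T y, maintained incrementally so each iteration only
    # touches the sampled dual blocks.
    Kty = sum(K.T @ yi for K, yi in zip(K_blocks, y))
    x_bar = x.copy()
    for _ in range(n_iters):
        # Dual ascent step on a random subset of blocks, using the extrapolated primal.
        idx = rng.choice(N, size=min(batch_size, N), replace=False)
        for i in idx:
            y_new = prox_fconj(i, y[i] + sigma * (K_blocks[i] @ x_bar), sigma)
            Kty += K_blocks[i].T @ (y_new - y[i])
            y[i] = y_new
        # Primal proximal descent step with the updated aggregate K^T y.
        x_new = prox_g(x - tau * Kty, tau)
        # Primal extrapolation, as in deterministic PDHG.
        x_bar = 2.0 * x_new - x
        x = x_new
    return x, y

# Example placeholder operators (purely hypothetical):
#   prox_g = lambda v, tau: v / (1.0 + tau)               # g(x) = 0.5 * ||x||^2
#   prox_fconj = lambda i, v, sigma: np.clip(v, -1.0, 1.0)  # f_i* an indicator of a box
```

In this sketch, maintaining the aggregate K^T y incrementally keeps the per-iteration cost proportional to the sampled batch rather than the full sample size, which is what makes a randomized-block scheme attractive when the discretization involves thousands of samples.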

Journal Article Details

Publisher Name:    Global Science Press

Language:    English

DOI:    https://doi.org/10.4208/nmtma.OA-2020-0125

Published online:    2021-01

Copyright:    © Global Science Press

Pages:    36

Keywords:    Quantitative stability analysis, Hoffman lemma, discrete approximation method, distributionally robust optimization, stochastic primal-dual hybrid gradient.

Author Details

Yongchao Liu

Xiaoming Yuan

Jin Zhang
