Volume 3, Issue 4
Deep Ritz Methods for Laplace Equations with Dirichlet Boundary Condition

Chenguang Duan, Yuling Jiao, Yanming Lai, Xiliang Lu, Qimeng Quan & Jerry Zhijian Yang

CSIAM Trans. Appl. Math., 3 (2022), pp. 761-791.

Published online: 2022-11

  • Abstract

Deep Ritz methods (DRM) have been shown numerically to be efficient in solving partial differential equations. In this paper, we present a convergence rate in the $H^1$ norm for deep Ritz methods for Laplace equations with a Dirichlet boundary condition, where the error depends explicitly on the depth and width of the deep neural networks and on the number of samples. Furthermore, the depth and width of the networks can be chosen appropriately in terms of the number of training samples. The main idea of the proof is to decompose the total error of DRM into three parts: the approximation error, the statistical error, and the error caused by the boundary penalty. We bound the approximation error in the $H^1$ norm with ${\rm ReLU}^2$ networks and control the statistical error via Rademacher complexity. In particular, we derive a bound on the Rademacher complexity of the non-Lipschitz composition of the gradient norm with a ${\rm ReLU}^2$ network, which is of independent interest. We also analyze the error induced by the boundary penalty method and give an a priori rule for tuning the penalty parameter.
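The boundary-penalty formulation analyzed in the paper can be illustrated with a minimal one-dimensional sketch (not the authors' implementation): for $-u'' = f$ on $(0,1)$ with $u(0)=u(1)=0$, one minimizes a Monte Carlo estimate of the penalized Ritz energy $\int_0^1 \big(\tfrac{1}{2}|u'|^2 - fu\big)\,dx + \lambda\,(u(0)^2 + u(1)^2)$. Here the deep ${\rm ReLU}^2$ network is replaced by a tiny hypothetical ansatz $u(x) = a + bx + c\sin(\pi x)$ so the example runs with the standard library, and the penalty weight `lam` is set by hand rather than by the paper's a priori rule.

```python
import math
import random

# Test problem: -u'' = f on (0,1), u(0) = u(1) = 0, with
# f(x) = pi^2 sin(pi x), so the exact solution is u(x) = sin(pi x).
# Illustrative ansatz (stand-in for a deep ReLU^2 network):
#   u(x) = a + b*x + c*sin(pi*x),   u'(x) = b + c*pi*cos(pi*x).

random.seed(0)
xs = [random.random() for _ in range(500)]       # Monte Carlo sample in (0,1)
sin_px = [math.sin(math.pi * x) for x in xs]     # precomputed per-sample values
cos_px = [math.cos(math.pi * x) for x in xs]
fx = [math.pi**2 * s for s in sin_px]            # f at the sample points
n = len(xs)

a, b, c = 0.5, 0.5, 0.0                          # initial parameters
lam, lr = 50.0, 0.005                            # penalty weight, step size

for _ in range(1500):
    # Gradient of the empirical penalized Ritz energy
    #   E = mean( 0.5*u'(x)^2 - f(x)*u(x) ) + lam*(u(0)^2 + u(1)^2)
    ga = gb = gc = 0.0
    for x, s, co, fv in zip(xs, sin_px, cos_px, fx):
        du = b + c * math.pi * co                # u'(x)
        ga += -fv                                # d/da of the integrand
        gb += du - fv * x                        # d/db
        gc += du * math.pi * co - fv * s         # d/dc
    ga, gb, gc = ga / n, gb / n, gc / n
    # Boundary penalty: u(0) = a, u(1) = a + b
    ga += lam * (2 * a + 2 * (a + b))
    gb += lam * 2 * (a + b)
    a, b, c = a - lr * ga, b - lr * gb, c - lr * gc

u_mid = a + 0.5 * b + c                          # approximation at x = 0.5 (exact: 1)
```

The penalty drives the boundary values $u(0)=a$ and $u(1)=a+b$ toward zero only up to an $O(1/\lambda)$ bias, while the Monte Carlo sampling contributes a statistical error; this mirrors the three-way error decomposition in the paper, where the network capacity (here frozen into the ansatz) governs the approximation error.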

  • AMS Subject Headings

62G05, 65N12, 65N15, 68T07

  • Copyright

COPYRIGHT: © Global Science Press

  • BibTeX
  • RIS
  • TXT
@Article{CSIAM-AM-3-761, author = {Duan, Chenguang and Jiao, Yuling and Lai, Yanming and Lu, Xiliang and Quan, Qimeng and Yang, Jerry Zhijian}, title = {Deep Ritz Methods for Laplace Equations with Dirichlet Boundary Condition}, journal = {CSIAM Transactions on Applied Mathematics}, year = {2022}, volume = {3}, number = {4}, pages = {761--791}, issn = {2708-0579}, doi = {https://doi.org/10.4208/csiam-am.SO-2021-0043}, url = {http://global-sci.org/intro/article_detail/csiam-am/21155.html} }
TY  - JOUR
T1  - Deep Ritz Methods for Laplace Equations with Dirichlet Boundary Condition
AU  - Duan, Chenguang
AU  - Jiao, Yuling
AU  - Lai, Yanming
AU  - Lu, Xiliang
AU  - Quan, Qimeng
AU  - Yang, Jerry Zhijian
JO  - CSIAM Transactions on Applied Mathematics
VL  - 3
IS  - 4
SP  - 761
EP  - 791
PY  - 2022
DA  - 2022/11
SN  - 2708-0579
DO  - 10.4208/csiam-am.SO-2021-0043
UR  - https://global-sci.org/intro/article_detail/csiam-am/21155.html
KW  - Deep Ritz methods, convergence rate, Dirichlet boundary condition, approximation error, Rademacher complexity
ER  -
Chenguang Duan, Yuling Jiao, Yanming Lai, Xiliang Lu, Qimeng Quan & Jerry Zhijian Yang. (2022). Deep Ritz Methods for Laplace Equations with Dirichlet Boundary Condition. CSIAM Transactions on Applied Mathematics. 3 (4). 761-791. doi:10.4208/csiam-am.SO-2021-0043