Volume 39, Issue 3
Improved Analysis of PINNs: Alleviate the CoD for Compositional Solutions

Yuling Jiao, Xiliang Lu, Jerry Zhijian Yang, Cheng Yuan & Pingwen Zhang

Ann. Appl. Math., 39 (2023), pp. 239-263.

Published online: 2023-09

  • Abstract

In this paper, we present an improved analysis of the Physics-Informed Neural Networks (PINNs) method for solving second-order elliptic equations. By assuming an intrinsic sparse structure in the underlying solution, we provide a convergence rate analysis that can overcome the curse of dimensionality (CoD). Specifically, using approximation theory in Sobolev spaces together with the multivariate Faà di Bruno formula, we first derive the approximation error for composition functions with a small number of degrees of freedom in each compositional layer. Furthermore, by integrating several results on the statistical error of neural networks, we obtain a refined convergence rate analysis for PINNs in solving elliptic equations with compositional solutions. We also demonstrate the benefits of the intrinsic sparse structure with two simple numerical examples.
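
To fix ideas, here is a rough sketch of the setting described above; the notation is ours and only illustrative, not necessarily the paper's: a second-order elliptic problem, a compositional target in which each layer depends on only a few variables, and the standard empirical residual loss that PINNs minimize over a neural-network class.

\[
  -\Delta u = f \ \text{in } \Omega \subset \mathbb{R}^d, \qquad u = g \ \text{on } \partial\Omega,
\]
\[
  u^{*} = h_L \circ h_{L-1} \circ \cdots \circ h_1, \qquad \text{each } h_\ell \text{ depending on only a few of its inputs},
\]
\[
  \widehat{\mathcal{L}}(u_\theta) \;=\; \frac{1}{N}\sum_{i=1}^{N}\bigl(\Delta u_\theta(x_i) + f(x_i)\bigr)^2
  \;+\; \frac{\lambda}{M}\sum_{j=1}^{M}\bigl(u_\theta(y_j) - g(y_j)\bigr)^2 .
\]

Here $x_i$ denote interior collocation points, $y_j$ boundary points, and $\lambda$ a penalty weight. The point of the sparse compositional assumption is that both the approximation error (via the Faà di Bruno formula applied layer by layer) and the statistical error then scale with the small per-layer dimension rather than the ambient dimension $d$.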

  • AMS Subject Headings

68T07, 65N99

  • Copyright

COPYRIGHT: © Global Science Press

  • BibTeX

@Article{AAM-39-239,
  author  = {Jiao, Yuling and Lu, Xiliang and Yang, Jerry Zhijian and Yuan, Cheng and Zhang, Pingwen},
  title   = {Improved Analysis of PINNs: Alleviate the CoD for Compositional Solutions},
  journal = {Annals of Applied Mathematics},
  year    = {2023},
  volume  = {39},
  number  = {3},
  pages   = {239--263},
  doi     = {10.4208/aam.OA-2023-0021},
  url     = {http://global-sci.org/intro/article_detail/aam/21993.html}
}

  • RIS

TY  - JOUR
T1  - Improved Analysis of PINNs: Alleviate the CoD for Compositional Solutions
AU  - Jiao, Yuling
AU  - Lu, Xiliang
AU  - Yang, Jerry Zhijian
AU  - Yuan, Cheng
AU  - Zhang, Pingwen
JO  - Annals of Applied Mathematics
VL  - 39
IS  - 3
SP  - 239
EP  - 263
PY  - 2023
DA  - 2023/09
KW  - Composition function, deep neural network, approximation
DO  - 10.4208/aam.OA-2023-0021
UR  - https://global-sci.org/intro/article_detail/aam/21993.html
ER  -

  • TXT

Yuling Jiao, Xiliang Lu, Jerry Zhijian Yang, Cheng Yuan & Pingwen Zhang. (2023). Improved Analysis of PINNs: Alleviate the CoD for Compositional Solutions. Annals of Applied Mathematics. 39 (3). 239-263. doi: 10.4208/aam.OA-2023-0021