Year: 2023
Authors: Honghui Wang, Lu Lu, Shiji Song, Gao Huang
Communications in Computational Physics, Vol. 34 (2023), Iss. 4 : pp. 869–906
Abstract
Physics-informed neural networks (PINNs) are known to suffer from optimization difficulty. In this work, we reveal the connection between the optimization difficulty of PINNs and activation functions. Specifically, we show that PINNs exhibit high sensitivity to activation functions when solving PDEs with distinct properties. Existing works usually choose activation functions by inefficient trial-and-error. To avoid this inefficient manual selection and to alleviate the optimization difficulty of PINNs, we introduce adaptive activation functions that search for the optimal function when solving different problems. We compare different adaptive activation functions and discuss their limitations in the context of PINNs. Furthermore, we propose to tailor the idea of learning combinations of candidate activation functions to PINN optimization, which imposes higher requirements on the smoothness and diversity of the learned functions. This is achieved by removing activation functions that cannot provide higher-order derivatives from the candidate set and by incorporating elementary functions with different properties according to our prior knowledge about the PDE at hand. We further enlarge the search space with adaptive slopes. The proposed adaptive activation function can be used to solve different PDE systems in an interpretable way. Its effectiveness is demonstrated on a series of benchmarks. Code is available at https://github.com/LeapLabTHU/AdaAFforPINNs.
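The core idea in the abstract can be sketched in a few lines: a minimal, hypothetical illustration (not the authors' exact implementation) of an activation built as a learnable convex combination of smooth candidate functions, with a trainable slope scaling the input. The candidate set, the `weights`/`slope` parameter names, and the softmax mixing are assumptions for illustration; candidates are restricted to functions with higher-order derivatives, since PDE residuals require second and higher derivatives of the network.

```python
import numpy as np

# Smooth candidate activations with different properties; e.g. sin is
# periodic and may suit wave-like PDEs, tanh is saturating and odd.
# ReLU-style functions are excluded: they lack higher-order derivatives.
CANDIDATES = [
    np.tanh,
    np.sin,
    lambda x: 1.0 / (1.0 + np.exp(-x)),  # sigmoid
]

def adaptive_activation(x, weights, slope):
    """phi(x) = sum_i softmax(weights)_i * f_i(slope * x).

    `weights` and `slope` would be trained jointly with the PINN's
    other parameters; the adaptive slope enlarges the search space.
    """
    w = np.exp(weights - np.max(weights))
    w = w / w.sum()                      # softmax keeps the mix convex
    z = slope * x                        # adaptive slope on the input
    return sum(wi * f(z) for wi, f in zip(w, CANDIDATES))

# With all logits equal, the mix is a uniform average of the candidates.
x = np.linspace(-1.0, 1.0, 5)
y = adaptive_activation(x, weights=np.zeros(3), slope=2.0)
```

Because the learned mixture weights are explicit, inspecting them after training indicates which elementary function dominates for a given PDE, which is one way such a scheme stays interpretable.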
Journal Article Details
Publisher Name: Global Science Press
Language: English
DOI: https://doi.org/10.4208/cicp.OA-2023-0058
Published online: 2023-01
Copyright: © Global Science Press
Pages: 38
Keywords: Partial differential equations, deep learning, adaptive activation functions, physics-informed neural networks.
Cited By
- Learning Feynman integrals from differential equations with neural networks
  Calisto, Francesco | Moodie, Ryan | Zoia, Simone. Journal of High Energy Physics, Vol. 2024 (2024), Iss. 7.
  https://doi.org/10.1007/JHEP07(2024)124 [Citations: 0]
- A comprehensive review of advances in physics-informed neural networks and their applications in complex fluid dynamics
  Zhao, Chi | Zhang, Feifei | Lou, Wenqiang | Wang, Xi | Yang, Jianyong. Physics of Fluids, Vol. 36 (2024), Iss. 10.
  https://doi.org/10.1063/5.0226562 [Citations: 0]
- Simple yet effective adaptive activation functions for physics-informed neural networks
  Zhang, Jun | Ding, Chensen. Computer Physics Communications, Vol. 307 (2025), Iss. P.109428.
  https://doi.org/10.1016/j.cpc.2024.109428 [Citations: 0]
- Render unto Numerics: Orthogonal Polynomial Neural Operator for PDEs with Nonperiodic Boundary Conditions
  Liu, Ziyuan | Wang, Haifeng | Zhang, Hong | Bao, Kaijun | Qian, Xu | Song, Songhe. SIAM Journal on Scientific Computing, Vol. 46 (2024), Iss. 4 P.C323.
  https://doi.org/10.1137/23M1556320 [Citations: 0]
- Transfer learning on physics-informed neural networks for tracking the hemodynamics in the evolving false lumen of dissected aorta
  Daneker, Mitchell | Cai, Shengze | Qian, Ying | Myzelev, Eric | Kumbhat, Arsh | Li, He | Lu, Lu. Nexus, Vol. 1 (2024), Iss. 2 P.100016.
  https://doi.org/10.1016/j.ynexs.2024.100016 [Citations: 0]
- Toward Physics-Informed Machine-Learning-Based Predictive Maintenance for Power Converters—A Review
  Fassi, Youssof | Heiries, Vincent | Boutet, Jerome | Boisseau, Sebastien. IEEE Transactions on Power Electronics, Vol. 39 (2024), Iss. 2 P.2692.
  https://doi.org/10.1109/TPEL.2023.3328438 [Citations: 10]