An Asymptotic Expansion-Based Deep Neural Network for Solving Singularly Perturbed Problems with Exponential Boundary Layers
Abstract
Physics-informed neural networks (PINNs) have emerged as a powerful framework for solving partial differential equations (PDEs). However, their effectiveness deteriorates when applied to singularly perturbed problems, whose solutions exhibit steep gradients and boundary layers confined to narrow regions, features that standard PINN architectures often fail to capture. To overcome these obstacles, we propose a novel, mesh-free deep neural network (DNN) method that incorporates asymptotic information through a systematic decomposition of the solution into smooth and sharp components. By capitalizing on the ability of DNNs to approximate smooth functions accurately, our approach constructs a series of moderately sized networks tailored to the different components of the solution. This strategy yields uniform approximation accuracy across a wide range of perturbation parameters, remaining robust and efficient even in the extreme regime where the perturbation parameter is as small as $10^{-16}$, near machine precision. Key advantages of the proposed method include its conceptual simplicity, full independence from mesh requirements, and ease of implementation. Extensive numerical experiments confirm that our approach delivers significantly improved accuracy and efficiency compared to standard PINNs, demonstrating its potential as a robust and versatile framework for tackling singularly perturbed problems.
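To make the decomposition idea concrete, the following is a minimal numpy sketch, not the authors' implementation. It uses an illustrative model problem of our own choosing, $\varepsilon u' + u = x$ with $u(0) = 1$, whose exact solution splits into a smooth part $x - \varepsilon$ and an exponential layer $(1+\varepsilon)e^{-x/\varepsilon}$. A polynomial least-squares fit stands in for the smooth DNN component, while the sharp component is represented by the known asymptotic layer profile $e^{-x/\varepsilon}$ with a fitted amplitude; the function names `exact` and `fit_decomposed` are hypothetical.

```python
import numpy as np

def exact(x, eps):
    # Exact solution of the illustrative problem eps*u' + u = x, u(0) = 1:
    # a smooth part (x - eps) plus an exponential boundary layer at x = 0.
    return x - eps + (1 + eps) * np.exp(-x / eps)

def fit_decomposed(eps, deg=3, n=50):
    # Collocation points on [0, 1]; for tiny eps the layer profile
    # underflows to 0 away from x = 0, which is numerically harmless.
    x = np.linspace(0.0, 1.0, n)
    layer = np.exp(-x / eps)  # asymptotic layer profile, amplitude fitted below
    # Design matrix: polynomial basis (stand-in for the smooth network)
    # augmented with the layer column carrying the sharp component.
    A = np.column_stack([x**k for k in range(deg + 1)] + [layer])
    coef, *_ = np.linalg.lstsq(A, exact(x, eps), rcond=None)
    # Return the fitted approximation: smooth polynomial + fitted layer term.
    return lambda xx: (
        sum(c * xx**k for k, c in enumerate(coef[:-1]))
        + coef[-1] * np.exp(-xx / eps)
    )
```

Because the layer is carried by an explicit exponential rather than learned from steep data, the smooth remainder is easy to approximate, and the fit stays accurate uniformly in $\varepsilon$, even at $\varepsilon = 10^{-16}$.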