Scientific Machine Learning Through Physics-Informed Neural Networks: Where we are and What's Next

Cited by: 1126
Authors
Cuomo, Salvatore [1 ]
Di Cola, Vincenzo Schiano [2 ]
Giampaolo, Fabio [1 ]
Rozza, Gianluigi [3 ]
Raissi, Maziar [4 ]
Piccialli, Francesco [1 ]
Affiliations
[1] Univ Naples Federico II, Dept Math & Applicat Renato Caccioppoli, I-80126 Naples, Italy
[2] Univ Naples Federico II, Dept Elect Engn & Informat Technol, Via Claudio, I-80125 Naples, Italy
[3] SISSA, Int Sch Adv Studies, Math Area, MathLab, Via Bonomea 265, I-34136 Trieste, Italy
[4] Univ Colorado, Dept Appl Math, Boulder, CO 80309 USA
Keywords
Physics-Informed Neural Networks; Scientific Machine Learning; Deep Neural Networks; Nonlinear equations; Numerical methods; Partial Differential Equations; Uncertainty; UNCERTAINTY QUANTIFICATION; ARTIFICIAL-INTELLIGENCE; DIFFERENTIAL-EQUATIONS; SOLVING ORDINARY; APPROXIMATION; ALGORITHM; PRINCIPLES; MODEL;
DOI
10.1007/s10915-022-01939-z
CLC Number
O29 [Applied Mathematics];
Subject Classification Code
070104 [Applied Mathematics];
Abstract
Physics-Informed Neural Networks (PINNs) are neural networks (NNs) that encode model equations, such as Partial Differential Equations (PDEs), as a component of the neural network itself. PINNs are nowadays used to solve PDEs, fractional equations, integral-differential equations, and stochastic PDEs. This novel methodology has arisen as a multi-task learning framework in which a NN must fit observed data while reducing a PDE residual. This article provides a comprehensive review of the literature on PINNs: while the primary goal of the study is to characterize these networks and their related advantages and disadvantages, the review also attempts to incorporate publications on a broader range of collocation-based physics-informed neural networks, starting from the vanilla PINN and covering many other variants, such as physics-constrained neural networks (PCNN), variational hp-VPINN, and conservative PINN (CPINN). The study indicates that most research has focused on customizing the PINN through different activation functions, gradient optimization techniques, neural network structures, and loss function structures. Although PINNs have been applied to a wide range of problems, and have in some contexts proved more practical than classical numerical techniques such as the Finite Element Method (FEM), advancements are still possible, most notably on theoretical issues that remain unresolved.
Pages: 62
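
The abstract describes the PINN methodology as a multi-task learning setup in which a network is trained to fit data while driving a PDE residual toward zero. The following is a minimal illustrative sketch of that idea, not code from the paper: it trains a small PyTorch network on an assumed toy 1D Poisson problem u''(x) = -pi^2 sin(pi x) with u(0) = u(1) = 0 (exact solution sin(pi x)), combining a residual term and a boundary term in one loss. The network size, collocation sampling, and optimizer settings are illustrative choices only.

# Minimal PINN sketch (illustrative, not the paper's code): toy 1D Poisson problem.
import torch
import math

torch.manual_seed(0)

# Small fully connected network u_theta(x): the PINN surrogate for the solution.
net = torch.nn.Sequential(
    torch.nn.Linear(1, 32), torch.nn.Tanh(),
    torch.nn.Linear(32, 32), torch.nn.Tanh(),
    torch.nn.Linear(32, 1),
)

def pde_residual(x):
    """Residual r(x) = u''(x) + pi^2 * sin(pi * x), via automatic differentiation."""
    x = x.requires_grad_(True)
    u = net(x)
    du = torch.autograd.grad(u, x, torch.ones_like(u), create_graph=True)[0]
    d2u = torch.autograd.grad(du, x, torch.ones_like(du), create_graph=True)[0]
    return d2u + math.pi ** 2 * torch.sin(math.pi * x)

optimizer = torch.optim.Adam(net.parameters(), lr=1e-3)
x_bc = torch.tensor([[0.0], [1.0]])               # boundary collocation points
for step in range(5000):
    x_f = torch.rand(128, 1)                      # interior collocation points, resampled each step
    loss_pde = pde_residual(x_f).pow(2).mean()    # physics term: PDE residual
    loss_bc = net(x_bc).pow(2).mean()             # data/boundary term: u(0) = u(1) = 0
    loss = loss_pde + loss_bc                     # composite multi-task loss
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()

# Quick check against the exact solution sin(pi * x).
x_test = torch.linspace(0, 1, 11).reshape(-1, 1)
print(torch.max(torch.abs(net(x_test) - torch.sin(math.pi * x_test))).item())

The two loss terms correspond to the physics and data/boundary tasks of the multi-task formulation discussed in the abstract; how these terms are weighted and which collocation points are sampled are themselves design choices studied in the PINN literature reviewed by the paper.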