Preconditioning for Physics-Informed Neural Networks
CoRR (2024)
Abstract
Physics-informed neural networks (PINNs) have shown promise in solving
various partial differential equations (PDEs). However, training pathologies
have negatively affected the convergence and prediction accuracy of PINNs,
which limits their practical applications. In this paper, we propose to
use the condition number as a metric to diagnose and mitigate the pathologies in
PINNs. Inspired by classical numerical analysis, where the condition number
measures sensitivity and stability, we highlight its pivotal role in the
training dynamics of PINNs. We prove theorems revealing how the condition number
relates to both the error control and convergence of PINNs. Subsequently, we
present an algorithm that leverages preconditioning to improve the condition
number. Evaluations on 18 PDE problems showcase the superior performance of our
method. Notably, in 7 of these problems, our method reduces errors by an
order of magnitude. These empirical findings verify the critical role of the
condition number in PINNs' training.
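The abstract appeals to the classical notion that a large condition number signals an ill-conditioned, hard-to-solve system, and that preconditioning can shrink it. This is not the paper's algorithm, but a minimal NumPy sketch of that underlying idea: a simple Jacobi (diagonal) preconditioner applied to an ill-conditioned linear system dramatically reduces its condition number.

```python
import numpy as np

# Illustrative example only (not the paper's method): an ill-conditioned
# system whose condition number is the ratio of its extreme scales.
A = np.array([[1.0, 0.0],
              [0.0, 1e6]])

cond_before = np.linalg.cond(A)  # ~1e6: very ill-conditioned

# Jacobi preconditioner: M = diag(A). Solving M^{-1} A x = M^{-1} b
# instead of A x = b rebalances the scales of the equations.
M_inv = np.diag(1.0 / np.diag(A))
cond_after = np.linalg.cond(M_inv @ A)  # 1.0: perfectly conditioned

print(cond_before, cond_after)
```

In the paper's setting the role of `A` is played by the PDE operator underlying the PINN loss, and improving its conditioning is what stabilizes training; this toy case just makes the mechanism concrete.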