Multiscale Neural Operator: Learning Fast and Grid-independent PDE Solvers

arXiv (2022)

Abstract
Numerical simulations in climate, chemistry, or astrophysics are computationally too expensive for uncertainty quantification or parameter exploration at high resolution. Reduced-order or surrogate models are multiple orders of magnitude faster, but traditional surrogates are inflexible or inaccurate and pure machine learning (ML)-based surrogates are too data-hungry. We propose a hybrid, flexible surrogate model that exploits known physics for simulating large-scale dynamics and limits learning to the hard-to-model term, called the parametrization or closure, which captures the effect of fine-scale on large-scale dynamics. Leveraging neural operators, we are the first to learn grid-independent, non-local, and flexible parametrizations. Our multiscale neural operator is motivated by a rich literature in multiscale modeling, has quasilinear runtime complexity, is more accurate or flexible than state-of-the-art parametrizations, and is demonstrated on the chaotic multiscale Lorenz96 equation.
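The idea described in the abstract is to keep the known coarse-scale physics and learn only the closure term that stands in for the unresolved fine-scale dynamics. Below is a minimal sketch, not the paper's code, of the two-scale Lorenz96 system named in the abstract and of such a coarse surrogate. The parameter values (K, J, F, h, b, c) are common defaults for this benchmark rather than values taken from the paper, and the closure here is a placeholder function where the paper would use a neural operator.

```python
# Minimal sketch of the two-scale Lorenz96 benchmark and a coarse surrogate
# with a learned closure term (placeholder here). Not the paper's code.
import numpy as np

K, J = 8, 32               # number of large-scale (X) and fine-scale (Y per X) variables
F = 18.0                   # external forcing
h, b, c = 1.0, 10.0, 10.0  # coupling strength, spatial-scale ratio, time-scale ratio


def full_rhs(X, Y):
    """Right-hand side of the coupled two-scale Lorenz96 equations.

    X has shape (K,); Y is a flat cyclic vector of shape (K*J,)."""
    # Large-scale tendency: advection, damping, forcing, and fine-scale coupling.
    dX = (np.roll(X, 1) * (np.roll(X, -1) - np.roll(X, 2))
          - X + F
          - (h * c / b) * Y.reshape(K, J).sum(axis=1))
    # Fine-scale tendency, driven by the large-scale variable it belongs to.
    dY = (-c * b * np.roll(Y, -1) * (np.roll(Y, -2) - np.roll(Y, 1))
          - c * Y
          + (h * c / b) * np.repeat(X, J))
    return dX, dY


def coarse_rhs(X, closure):
    """Coarse surrogate: known large-scale physics plus a learned closure term."""
    return (np.roll(X, 1) * (np.roll(X, -1) - np.roll(X, 2))
            - X + F
            + closure(X))


def rk4_step(f, state, dt):
    """Classical fourth-order Runge-Kutta step (illustration only)."""
    k1 = f(state)
    k2 = f(state + 0.5 * dt * k1)
    k3 = f(state + 0.5 * dt * k2)
    k4 = f(state + dt * k3)
    return state + dt / 6.0 * (k1 + 2 * k2 + 2 * k3 + k4)


# Placeholder closure; the paper learns this map (X -> coupling term) with a neural operator.
zero_closure = lambda X: np.zeros_like(X)

X = F + np.random.randn(K)
Y = 0.1 * np.random.randn(K * J)
dX, dY = full_rhs(X, Y)                                          # full coupled tendencies
X = rk4_step(lambda x: coarse_rhs(x, zero_closure), X, dt=0.005)  # one surrogate step
```

In this kind of setup, training targets for the closure would typically be the difference between the full model's large-scale tendency and the closure-free coarse tendency; the surrogate then integrates only the K large-scale variables, which is where the speedup comes from.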
Keywords
physics-informed machine learning, PINNs, scientific machine learning, neural ODEs, neural operators, machine learning, neural networks, Matryoshka, multiphysics, multiscale, parametrizations, closure, subgrid, superstructures, partial differential equations, PDEs, differential equations, numerical solvers, physics, HPC, surrogate, reduced order modeling, model reduction, uncertainty quantification, climate, fluid dynamics, computational physics