Data-Efficient Operator Learning via Unsupervised Pretraining and In-Context Learning
CoRR (2024)
Abstract
Recent years have witnessed the promise of coupling machine learning methods
and physical domain-specific insight for solving scientific problems based on
partial differential equations (PDEs). However, these methods are
data-intensive, still requiring large amounts of simulated PDE data. This
reintroduces the need for expensive numerical PDE solvers, partially
undermining the original goal of avoiding such simulations. In this work, seeking data efficiency,
we design unsupervised pretraining and in-context learning methods for PDE
operator learning. To reduce the need for training data with simulated
solutions, we pretrain neural operators on unlabeled PDE data using
reconstruction-based proxy tasks. To improve out-of-distribution performance,
we further assist neural operators in flexibly leveraging in-context learning
methods, without incurring extra training costs or designs. Extensive empirical
evaluations on a diverse set of PDEs demonstrate that our method is highly
data-efficient, more generalizable, and even outperforms conventional
vision-pretrained models.
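The reconstruction-based proxy task mentioned above can be illustrated with a minimal, self-contained sketch: mask a fraction of the grid points in an unlabeled PDE solution snapshot and score how well the hidden values can be recovered from the visible ones. Everything here (the synthetic 1-D snapshots, the low-rank least-squares "reconstructor", the mask ratio) is an illustrative assumption, not the paper's actual architecture or training setup.

```python
import numpy as np

rng = np.random.default_rng(0)

# Unlabeled PDE data (hypothetical): snapshots of 1-D sinusoidal fields with
# decaying amplitudes -- stand-ins for solver-free solution samples.
x = np.linspace(0.0, 1.0, 64)
snapshots = np.stack(
    [np.sin(2 * np.pi * k * x) * np.exp(-k) for k in range(1, 33)]
)

def masked_reconstruction_loss(fields, mask_ratio=0.25, rank=8):
    """Mask a fraction of grid points per snapshot, reconstruct them from a
    low-rank basis fit on the visible points only, and return the mean
    squared error on the masked points (the proxy-task objective)."""
    n, d = fields.shape
    # Low-rank basis learned from the unlabeled snapshot matrix via SVD.
    _, _, vt = np.linalg.svd(fields, full_matrices=False)
    basis = vt[:rank].T  # shape (d, rank)
    errors = []
    for field in fields:
        mask = rng.random(d) < mask_ratio  # True = hidden grid point
        visible = ~mask
        # Fit basis coefficients using only the visible points.
        coef, *_ = np.linalg.lstsq(basis[visible], field[visible], rcond=None)
        recon = basis @ coef
        errors.append(np.mean((recon[mask] - field[mask]) ** 2))
    return float(np.mean(errors))

loss = masked_reconstruction_loss(snapshots)
print(f"masked-reconstruction loss: {loss:.3e}")
```

In an actual pretraining pipeline, the fixed SVD basis would be replaced by a neural operator trained to minimize this masked-reconstruction error, so that no simulated input-output solution pairs are needed at this stage.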