Training neural operators to preserve invariant measures of chaotic attractors
arXiv (2023)
Abstract
Chaotic systems make long-horizon forecasts difficult because small
perturbations in initial conditions cause trajectories to diverge at an
exponential rate. In this setting, neural operators trained to minimize squared
error losses, while capable of accurate short-term forecasts, often fail to
reproduce statistical or structural properties of the dynamics over longer time
horizons and can yield degenerate results. In this paper, we propose an
alternative framework designed to preserve invariant measures of chaotic
attractors that characterize the time-invariant statistical properties of the
dynamics. Specifically, in the multi-environment setting (where each sample
trajectory is governed by slightly different dynamics), we consider two novel
approaches to training with noisy data. First, we propose a loss based on the
optimal transport distance between the observed dynamics and the neural
operator outputs. This approach requires expert knowledge of the underlying
physics to determine what statistical features should be included in the
optimal transport loss. Second, we show that a contrastive learning framework,
which does not require any specialized prior knowledge, can preserve
statistical properties of the dynamics nearly as well as the optimal transport
approach. On a variety of chaotic systems, our method is shown empirically to
preserve invariant measures of chaotic attractors.
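To make the first approach concrete, here is a minimal illustrative sketch (not the authors' implementation) of an optimal-transport-style loss: each trajectory is mapped to empirical distributions of expert-chosen summary statistics, and the 1-D Wasserstein distances between the observed and predicted statistic distributions are summed. The feature maps below (per-step energy and a single coordinate) are hypothetical stand-ins for the physics-informed statistics the paper assumes.

```python
import numpy as np
from scipy.stats import wasserstein_distance

def ot_statistics_loss(true_traj, pred_traj, feature_fns):
    """Sum of 1-D Wasserstein distances between empirical distributions
    of each summary statistic, computed over the two trajectories."""
    return sum(
        wasserstein_distance(f(true_traj), f(pred_traj))
        for f in feature_fns
    )

rng = np.random.default_rng(0)
true_traj = rng.normal(size=(1000, 3))  # stand-in trajectory, shape (time, dim)
pred_traj = rng.normal(size=(1000, 3))  # stand-in neural-operator output

# Hypothetical physics-informed statistics: per-step energy, first coordinate.
features = [lambda x: (x ** 2).sum(axis=1), lambda x: x[:, 0]]
loss = ot_statistics_loss(true_traj, pred_traj, features)
```

Because the loss compares distributions of statistics rather than pointwise trajectories, it is insensitive to the exponential divergence of individual chaotic trajectories, which is the property the abstract highlights.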