Multi-Lattice Sampling of Quantum Field Theories via Neural Operator-based Flows

CoRR (2024)

Abstract
We consider the problem of sampling discrete field configurations ϕ from the Boltzmann distribution [dϕ] Z^-1 e^-S[ϕ], where S is the lattice discretization of the continuous Euclidean action 𝒮 of some quantum field theory. Since such densities arise as approximations of the underlying functional density [𝒟ϕ(x)] 𝒵^-1 e^-𝒮[ϕ(x)], we frame the task as an instance of operator learning. In particular, we propose to approximate a time-dependent operator 𝒱_t whose time integral provides a mapping between the functional distributions of the free theory [𝒟ϕ(x)] 𝒵_0^-1 e^-𝒮_0[ϕ(x)] and of the target theory [𝒟ϕ(x)] 𝒵^-1 e^-𝒮[ϕ(x)]. Whenever a particular lattice is chosen, the operator 𝒱_t can be discretized to a finite-dimensional, time-dependent vector field V_t, which in turn induces a continuous normalizing flow between finite-dimensional distributions over the chosen lattice. This flow can then be trained to be a diffeomorphism between the discretized free and target theories [dϕ] Z_0^-1 e^-S_0[ϕ] and [dϕ] Z^-1 e^-S[ϕ]. We run experiments on the ϕ^4-theory to explore to what extent such operator-based flow architectures generalize to lattice sizes they were not trained on, and we show that pretraining on smaller lattices can lead to a speedup over training only on the target lattice size.
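To make the construction concrete, the following is a minimal sketch (not the authors' implementation) of the core ingredients described above: a lattice ϕ^4 action, a time-dependent vector field V_t realized as a fully convolutional network with periodic padding (so that the same weights can be evaluated on lattices of different sizes, illustrating the multi-lattice aspect), and an explicit Euler integration of the induced flow from t = 0 to t = 1. The architecture, the mass and coupling values, the integrator, and all hyperparameters are illustrative assumptions; the training objective (matching the flowed free theory to the target Boltzmann density) is omitted.

```python
# Hedged sketch: convolutional, time-dependent vector field on 2D lattices.
# All design choices below are assumptions for illustration, not the paper's.

import torch
import torch.nn as nn


def phi4_action(phi, m2=0.25, lam=0.5):
    """Standard lattice phi^4 action S[phi] for a batch of 2D fields (B, L, L).
    m2 and lam are placeholder parameter values."""
    kinetic = sum(
        0.5 * (torch.roll(phi, -1, dims=d) - phi) ** 2 for d in (1, 2)
    )
    potential = 0.5 * m2 * phi ** 2 + lam * phi ** 4
    return (kinetic + potential).sum(dim=(1, 2))


class VectorField(nn.Module):
    """V_t(phi): a discretization of the operator V_t on whatever lattice
    the input lives on; local convolutions keep it lattice-size agnostic."""

    def __init__(self, hidden=32):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(2, hidden, 3, padding=1, padding_mode="circular"),
            nn.GELU(),
            nn.Conv2d(hidden, hidden, 3, padding=1, padding_mode="circular"),
            nn.GELU(),
            nn.Conv2d(hidden, 1, 3, padding=1, padding_mode="circular"),
        )

    def forward(self, t, phi):
        # Broadcast the flow time t as an extra channel next to the field.
        t_channel = torch.full_like(phi, float(t))
        x = torch.stack([phi, t_channel], dim=1)   # (B, 2, L, L)
        return self.net(x).squeeze(1)              # (B, L, L)


def flow(phi0, v, n_steps=32):
    """Integrate d(phi)/dt = V_t(phi) from t=0 to t=1 with explicit Euler."""
    phi, dt = phi0, 1.0 / n_steps
    for k in range(n_steps):
        phi = phi + dt * v(k * dt, phi)
    return phi


if __name__ == "__main__":
    v = VectorField()
    # The same weights evaluate on different lattice sizes.
    for L in (8, 16):
        phi0 = torch.randn(4, L, L)        # stand-in for free-theory samples
        phi1 = flow(phi0, v)
        print(L, phi4_action(phi1).shape)  # torch.Size([4])
```

In a full sampler, the flow's log-density change would also be tracked (e.g. via the divergence of V_t) so that the pushed-forward free-theory density can be compared against e^-S[ϕ]; that bookkeeping is left out of the sketch for brevity.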