Efficient Sampling on Riemannian Manifolds via Langevin MCMC
NeurIPS 2024
Abstract
We study the task of efficiently sampling from a Gibbs distribution $d\pi^* = e^{-h}\, d\mathrm{vol}_g$ over a Riemannian manifold $M$ via (geometric) Langevin MCMC; this algorithm involves computing exponential maps in random Gaussian directions and is efficiently implementable in practice. The key to our analysis of Langevin MCMC is a bound on the discretization error of the geometric Euler–Maruyama scheme, assuming $\nabla h$ is Lipschitz and $M$ has bounded sectional curvature. Our error bound matches the error of Euclidean Euler–Maruyama in terms of its stepsize dependence. Combined with a contraction guarantee for the geometric Langevin diffusion under Kendall–Cranston coupling, we prove that the Langevin MCMC iterates lie within $\epsilon$-Wasserstein distance of $\pi^*$ after $\tilde{O}(\epsilon^{-2})$ steps, which matches the iteration complexity of Euclidean Langevin MCMC. Our results apply in general settings where $h$ may be nonconvex and $M$ may have negative Ricci curvature. Under the additional assumptions that the Riemannian curvature tensor has bounded derivatives and that $\pi^*$ satisfies a $\mathrm{CD}(\cdot,\infty)$ condition, we analyze the stochastic-gradient version of Langevin MCMC and bound its iteration complexity by $\tilde{O}(\epsilon^{-2})$ as well.
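To make the update the abstract describes concrete (an exponential-map step in a random Gaussian tangent direction), here is a minimal sketch of the geometric Euler–Maruyama iteration on the unit sphere $S^{d-1}$, where the exponential map has a closed form. The helper names (`sphere_exp`, `tangent_project`, `geometric_langevin_mcmc`), the step size, and the test potential $h(x) = \langle a, x\rangle$ are illustrative assumptions for this sketch, not code or parameters from the paper.

```python
import numpy as np

def sphere_exp(x, v):
    """Exponential map on the unit sphere S^{d-1}: follow the geodesic
    from x with initial tangent velocity v for unit time."""
    norm_v = np.linalg.norm(v)
    if norm_v < 1e-12:
        return x
    return np.cos(norm_v) * x + np.sin(norm_v) * (v / norm_v)

def tangent_project(x, u):
    """Orthogonally project an ambient vector u onto the tangent space
    T_x S^{d-1} = {v : <v, x> = 0}."""
    return u - np.dot(u, x) * x

def geometric_langevin_mcmc(grad_h, x0, step, n_iters, rng):
    """Geometric Euler-Maruyama discretization of the Langevin diffusion
    targeting dpi* = e^{-h} dvol on the sphere:
        x_{k+1} = exp_{x_k}( -step * grad h(x_k) + sqrt(2*step) * xi_k ),
    where xi_k is a standard Gaussian in the tangent space at x_k."""
    x = x0 / np.linalg.norm(x0)
    samples = []
    for _ in range(n_iters):
        g = tangent_project(x, grad_h(x))            # Riemannian gradient
        xi = tangent_project(x, rng.standard_normal(x.shape))
        x = sphere_exp(x, -step * g + np.sqrt(2.0 * step) * xi)
        x /= np.linalg.norm(x)                       # guard against numerical drift
        samples.append(x.copy())
    return np.array(samples)

# Illustrative potential h(x) = <a, x>: pi* concentrates near -a/|a|
# (a von Mises-Fisher-like target).
rng = np.random.default_rng(0)
a = np.array([3.0, 0.0, 0.0])
samples = geometric_langevin_mcmc(lambda x: a, rng.standard_normal(3),
                                  step=0.01, n_iters=5000, rng=rng)
print("empirical mean direction:", samples[2000:].mean(axis=0))
```

Note that projecting an ambient standard Gaussian onto the tangent space yields a standard Gaussian on the tangent space, so each iteration only requires one gradient evaluation, one Gaussian draw, and one exponential map, which is what makes the scheme efficiently implementable.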