Sampling with Mirrored Stein Operators
International Conference on Learning Representations (ICLR), 2022
Abstract
We introduce a new family of particle evolution samplers suitable for constrained domains and non-Euclidean geometries. Stein Variational Mirror Descent and Mirrored Stein Variational Gradient Descent minimize the Kullback-Leibler (KL) divergence to constrained target distributions by evolving particles in a dual space defined by a mirror map. Stein Variational Natural Gradient exploits non-Euclidean geometry to more efficiently minimize the KL divergence to unconstrained targets. We derive these samplers from a new class of mirrored Stein operators and adaptive kernels developed in this work. We demonstrate that these new samplers yield accurate approximations to distributions on the simplex, deliver valid confidence intervals in post-selection inference, and converge more rapidly than prior methods in large-scale unconstrained posterior inference. Finally, we establish the convergence of our new procedures under verifiable conditions on the target distribution.
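To make the core idea concrete, below is a minimal illustrative sketch of a mirrored particle update on the probability simplex, in the spirit of Mirrored Stein Variational Gradient Descent: particles are mapped to a dual space by a mirror map, given an SVGD-style velocity there, and mapped back. This is not the paper's exact mirrored Stein operator or adaptive kernel; the entropic mirror map, RBF kernel, Dirichlet target, and all step sizes are illustrative assumptions.

```python
import numpy as np

def mirrored_svgd_step(theta, alpha, h=1.0, step=1e-2):
    """One illustrative mirrored-SVGD-style step on the simplex.

    theta: (n, d) particles, each row on the probability simplex.
    alpha: (d,) Dirichlet concentration parameters (assumed target).
    """
    n = theta.shape[0]
    # RBF kernel between particles (bandwidth h is an assumption).
    diffs = theta[:, None, :] - theta[None, :, :]
    K = np.exp(-(diffs ** 2).sum(-1) / (2 * h))
    # Score of the Dirichlet density in the open simplex interior;
    # the simplex constraint itself is handled by the mirror map.
    score = (alpha - 1.0) / theta
    drift = K @ score                                    # pull toward high density
    repulsion = (diffs * K[..., None]).sum(axis=1) / h   # keep particles spread out
    # Entropic mirror map: update in the dual space eta = log(theta)...
    eta = np.log(theta) + step * (drift + repulsion) / n
    # ...then map back onto the simplex via the inverse map (softmax).
    e = np.exp(eta - eta.max(axis=1, keepdims=True))
    return e / e.sum(axis=1, keepdims=True)

rng = np.random.default_rng(0)
particles = rng.dirichlet(np.ones(3), size=50)
for _ in range(100):
    particles = mirrored_svgd_step(particles, alpha=np.array([2.0, 3.0, 4.0]))
```

Because every step ends with the inverse mirror map, the particles remain valid points on the simplex by construction, which is the practical appeal of evolving them in the dual space rather than the constrained primal space.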
Keywords
Stein's method, Sampling, Mirror descent, Natural gradient descent, Probabilistic inference, Bayesian inference, Post-selection inference, Stein operators