Modified Hamiltonian Monte Carlo for Bayesian inference

Statistics and Computing (2019)

Abstract
The Hamiltonian Monte Carlo (HMC) method has been recognized as a powerful sampling tool in computational statistics. We show that the performance of HMC can be significantly improved by incorporating importance sampling and an irreversible part of the dynamics into the chain. This is achieved by replacing the Hamiltonians in the Metropolis test with modified Hamiltonians, and the complete momentum update with a partial momentum refreshment. We call the resulting generalized HMC importance sampler Mix & Match Hamiltonian Monte Carlo (MMHMC). The method is irreversible by construction and further benefits from (i) efficient algorithms for computing modified Hamiltonians; (ii) an implicit momentum update procedure; and (iii) multistage splitting integrators specially derived for methods that sample with modified Hamiltonians. MMHMC has been implemented, tested on popular statistical models, and compared in sampling efficiency with HMC, Riemann Manifold Hamiltonian Monte Carlo, Generalized Hybrid Monte Carlo, Generalized Shadow Hybrid Monte Carlo, the Metropolis Adjusted Langevin Algorithm, and Random Walk Metropolis–Hastings. To make a fair comparison, we propose a metric that accounts for correlations among samples and weights and can be readily used for all methods that generate such samples. The experiments reveal the superiority of MMHMC over popular sampling techniques, especially on high-dimensional problems.
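To make the construction described above concrete, the following is a minimal Python/NumPy sketch of a single MMHMC-style transition with importance reweighting. It assumes a unit mass matrix, a plain GHMC-style partial momentum refreshment with mixing angle `phi`, and the standard fourth-order shadow Hamiltonian of the velocity Verlet (leapfrog) integrator. It is not the paper's implementation: MMHMC uses an implicit momentum update and cheaper, integrator-specific formulas for the modified Hamiltonians, as well as multistage splitting integrators; all function and parameter names here are hypothetical.

```python
import numpy as np

def hamiltonian(theta, p, U):
    """True Hamiltonian with a unit mass matrix (an assumption of this sketch)."""
    return U(theta) + 0.5 * p @ p

def shadow_hamiltonian(theta, p, U, grad_U, hess_U, h):
    """Fourth-order shadow Hamiltonian of the velocity Verlet (leapfrog) integrator:
    H~ = H + (h^2/12) p^T (grad^2 U) p - (h^2/24) |grad U|^2 + O(h^4).
    The paper derives cheaper, Hessian-free estimates; this explicit form is used
    only to keep the sketch self-contained."""
    g = grad_U(theta)
    return (hamiltonian(theta, p, U)
            + (h ** 2 / 12.0) * p @ hess_U(theta) @ p
            - (h ** 2 / 24.0) * g @ g)

def leapfrog(theta, p, grad_U, h, L):
    """Standard kick-drift-kick leapfrog trajectory of L steps of size h."""
    p = p - 0.5 * h * grad_U(theta)
    for i in range(L):
        theta = theta + h * p
        p = p - (h if i < L - 1 else 0.5 * h) * grad_U(theta)
    return theta, p

def mmhmc_style_step(theta, p, U, grad_U, hess_U, h, L, phi, rng):
    """One generalized-HMC transition targeting exp(-H~), returning the new state
    and its importance weight w = exp(H~ - H) with respect to exp(-H)."""
    # (1) Partial momentum refreshment (plain GHMC mixing; the paper instead uses
    #     a modified, implicit momentum update consistent with exp(-H~)).
    noise = rng.standard_normal(p.shape)
    p = np.cos(phi) * p + np.sin(phi) * noise
    # (2) Leapfrog proposal, accepted in a Metropolis test on the *modified*
    #     Hamiltonian rather than the true one.
    theta_prop, p_prop = leapfrog(theta, p, grad_U, h, L)
    dH_shadow = (shadow_hamiltonian(theta_prop, p_prop, U, grad_U, hess_U, h)
                 - shadow_hamiltonian(theta, p, U, grad_U, hess_U, h))
    if rng.random() < np.exp(-dH_shadow):
        theta, p = theta_prop, p_prop   # accept the proposal
    else:
        p = -p                          # reject: momentum flip (irreversible together with partial refresh)
    # (3) Importance weight correcting samples drawn from exp(-H~) back to exp(-H).
    w = np.exp(shadow_hamiltonian(theta, p, U, grad_U, hess_U, h)
               - hamiltonian(theta, p, U))
    return theta, p, w

# Toy usage on a standard Gaussian target, U(theta) = 0.5 * |theta|^2.
rng = np.random.default_rng(0)
d = 10
U = lambda th: 0.5 * th @ th
grad_U = lambda th: th
hess_U = lambda th: np.eye(d)
theta, p = np.zeros(d), rng.standard_normal(d)
samples, weights = [], []
for _ in range(2000):
    theta, p, w = mmhmc_style_step(theta, p, U, grad_U, hess_U,
                                   h=0.2, L=10, phi=0.5, rng=rng)
    samples.append(theta.copy())
    weights.append(w)
weights = np.array(weights)
# Weighted estimate of E[theta]; it should be close to zero for this target.
post_mean = (weights[:, None] * np.array(samples)).sum(0) / weights.sum()
```

Because the chain targets the modified density exp(-H~), every estimate must combine the sample autocorrelation with the importance weights, which is exactly why the abstract proposes an efficiency metric that accounts for both.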
Keywords
Bayesian inference, Markov chain Monte Carlo, Hamiltonian Monte Carlo, Importance sampling, Modified Hamiltonians