Revisiting the balance heuristic for estimating normalising constants

arXiv (2019)

Abstract
Multiple importance sampling estimators are widely used for computing intractable constants due to their reliability and robustness. The celebrated balance heuristic estimator belongs to this class of methods and has proved very successful in computer graphics. The basic ingredients for computing the estimator are a set of proposal distributions, indexed by some discrete label, and a predetermined number of draws from each of these proposals. However, if the number of available proposals is much larger than the number of permitted importance points, one needs to select, possibly at random, which of these distributions will be used. This work focuses on that setting, exploring improvements and variations of the balance heuristic via a novel extended-space representation of the estimator, which leads to straightforward annealing schemes for variance reduction. In addition, we consider the intractable scenario where the proposal density is only available as a joint function with the discrete label, as may be encountered in problems where an ordering is imposed. For this case, we look at combinations of correlated unbiased estimators which also fit into the extended-space representation and, in turn, provide other interesting solutions.
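To make the "basic ingredients" concrete, the sketch below shows the standard balance heuristic MIS estimate of a normalising constant: draws from each labelled proposal are weighted against the full sample-weighted mixture of proposal densities. This is a minimal illustration under assumed choices (Gaussian proposals, an unnormalised Gaussian target, and the names pi_tilde, mus, sigmas, n_draws are all hypothetical); it is not the paper's code and does not cover the random proposal-selection or extended-space schemes studied in the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def pi_tilde(x):
    # Unnormalised target; its true normalising constant is sqrt(2*pi).
    return np.exp(-0.5 * x**2)

mus = np.array([-2.0, 0.0, 2.0])      # proposal means (indexed by discrete label j)
sigmas = np.array([1.0, 0.5, 1.0])    # proposal standard deviations
n_draws = np.array([100, 100, 100])   # predetermined number of draws per proposal

def q(j, x):
    # Density of the j-th Gaussian proposal evaluated at x.
    return np.exp(-0.5 * ((x - mus[j]) / sigmas[j]) ** 2) / (sigmas[j] * np.sqrt(2 * np.pi))

Z_hat = 0.0
for j in range(len(mus)):
    x = rng.normal(mus[j], sigmas[j], size=n_draws[j])  # draws from proposal q_j
    # Balance heuristic: every sample is weighted against the mixture
    # sum_k n_k q_k(x), regardless of which proposal generated it.
    mixture = sum(n_draws[k] * q(k, x) for k in range(len(mus)))
    Z_hat += np.sum(pi_tilde(x) / mixture)

print(Z_hat, np.sqrt(2 * np.pi))  # estimate vs. true normalising constant
```

The estimate is unbiased because, averaging over the draws from each q_j, the mixture denominators cancel against the sum of proposal densities, leaving the integral of pi_tilde.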