$\alpha$ Belief Propagation as Fully Factorized Approximation

IEEE Global Conference on Signal and Information Processing (2019)

Abstract
Belief propagation (BP) performs exact inference in loop-free graphs, but its performance can be poor in graphs with loops, and its solutions in that setting are not well understood. This work gives an interpretable belief propagation rule that corresponds to minimizing a localized alpha-divergence. We term this algorithm alpha belief propagation (alpha-BP). The performance of alpha-BP is tested on MAP (maximum a posteriori) inference problems, where alpha-BP can outperform (loopy) BP by a significant margin even in fully connected graphs.
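To make the idea concrete, below is a minimal illustrative sketch of α-parameterized loopy message passing on a pairwise MRF, used for MAP inference by taking the argmax of the resulting beliefs. This is not the paper's exact update rule (the abstract does not specify it); it assumes a common α-blended variant in the spirit of fractional BP / power EP, where each new message is a geometric mixture of the standard sum-product update and the previous message, recovering ordinary loopy BP at α = 1. All names (`alpha_bp`, `unary`, `pairwise`, `edges`) are hypothetical.

```python
# Illustrative sketch only: alpha-blended loopy BP on a pairwise MRF (not the paper's exact rule).
import numpy as np

def alpha_bp(unary, pairwise, edges, alpha=0.5, iters=50):
    """unary: dict node -> (K,) potentials; pairwise: dict edge -> (K, K) potentials;
    edges: list of (i, j) pairs. Returns a MAP-style assignment per node."""
    K = len(next(iter(unary.values())))
    # messages[(i, j)] is the message node i sends to node j
    msgs = {(i, j): np.ones(K) / K for (a, b) in edges for (i, j) in [(a, b), (b, a)]}
    nbrs = {}
    for a, b in edges:
        nbrs.setdefault(a, []).append(b)
        nbrs.setdefault(b, []).append(a)

    for _ in range(iters):
        new = {}
        for (i, j) in msgs:
            psi = pairwise[(i, j)] if (i, j) in pairwise else pairwise[(j, i)].T
            # product of unary potential and incoming messages to i, excluding the one from j
            prod = unary[i].copy()
            for k in nbrs[i]:
                if k != j:
                    prod *= msgs[(k, i)]
            bp_msg = psi.T @ prod                                   # standard sum-product update
            m = (bp_msg ** alpha) * (msgs[(i, j)] ** (1 - alpha))   # alpha-blended update
            new[(i, j)] = m / m.sum()
        msgs = new

    # beliefs: unary potential times all incoming messages, argmax for a MAP estimate
    assign = {}
    for i in unary:
        b = unary[i].copy()
        for k in nbrs.get(i, []):
            b *= msgs[(k, i)]
        assign[i] = int(np.argmax(b))
    return assign

# tiny fully connected 3-node example with binary variables
unary = {0: np.array([0.7, 0.3]), 1: np.array([0.4, 0.6]), 2: np.array([0.5, 0.5])}
attract = np.array([[1.2, 0.8], [0.8, 1.2]])
edges = [(0, 1), (1, 2), (0, 2)]
pairwise = {e: attract for e in edges}
print(alpha_bp(unary, pairwise, edges, alpha=0.5))
```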