Expectation Maximization In Deep Probabilistic Logic Programming
AI*IA 2018 - ADVANCES IN ARTIFICIAL INTELLIGENCE(2018)
Abstract
Probabilistic Logic Programming (PLP) combines logic and probability for representing and reasoning over domains with uncertainty. Hierarchical Probabilistic Logic Programming (HPLP) is a recent PLP language whose clauses are hierarchically organized, forming a deep neural network or arithmetic circuit. Inference in HPLP is done by circuit evaluation, so learning is cheaper than in generic PLP languages. In this paper we present an Expectation Maximization algorithm, called Expectation Maximization Parameter learning for HIerarchical Probabilistic Logic programs (EMPHIL), for learning HPLP parameters. The algorithm converts an arithmetic circuit into a Bayesian network and performs belief propagation over the corresponding factor graph.
Keywords
Hierarchical probabilistic logic programming, Arithmetic circuits, Expectation Maximization, Factor graph, Belief propagation
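The abstract states that inference in HPLP is done by evaluating an arithmetic circuit. The following is a minimal illustrative sketch, not the paper's implementation: it assumes a circuit whose AND nodes multiply child probabilities and whose OR nodes combine them with the probabilistic sum p ⊕ q = 1 − (1 − p)(1 − q), as is typical for HPLP-style circuits. The node encoding and function names are hypothetical.

```python
def prob_sum(ps):
    """Probabilistic sum (noisy-OR): combine independent probabilities."""
    out = 0.0
    for p in ps:
        out = 1.0 - (1.0 - out) * (1.0 - p)
    return out

def evaluate(node):
    """Recursively evaluate a circuit node.

    A node is either a float (a leaf probability, e.g. a learnable
    clause parameter) or a tuple (op, children) with op in {"and", "or"}.
    """
    if isinstance(node, float):
        return node
    op, children = node
    vals = [evaluate(c) for c in children]
    if op == "and":
        out = 1.0
        for v in vals:
            out *= v
        return out
    return prob_sum(vals)  # "or" node

# Hypothetical circuit: two clauses with parameters 0.4 and 0.3,
# each multiplied by the probability of its body.
circuit = ("or", [("and", [0.4, 0.9]), ("and", [0.3, 0.5])])
print(evaluate(circuit))  # 1 - (1 - 0.36) * (1 - 0.15) = 0.456
```

Parameter learning, as described in the abstract, would then adjust the leaf parameters via EM, using belief propagation on the factor graph obtained from this circuit to compute the expected counts.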