Automatic Differentiation in Prolog

Theory and Practice of Logic Programming (2023)

Abstract
Automatic differentiation (AD) is a range of algorithms to compute the numeric value of a function's (partial) derivative, where the function is typically given as a computer program or abstract syntax tree. AD has become immensely popular as part of many learning algorithms, notably for neural networks. This paper uses Prolog to systematically derive gradient-based forward- and reverse-mode AD variants from a simple executable specification: evaluation of the symbolic derivative. Along the way we demonstrate that several Prolog features (DCGs, co-routines) contribute to the succinct formulation of the algorithm. We also discuss two applications in probabilistic programming that are enabled by our Prolog algorithms. The first is parameter learning for the Sum-Product Loop Language and the second consists of both parameter learning and variational inference for probabilistic logic programming.
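The "simple executable specification" the abstract starts from can be pictured with a minimal Prolog sketch (illustrative only, not the paper's code; the predicate names deriv/2 and eval/3 are our own): symbolic differentiation of an arithmetic expression in one variable, followed by numeric evaluation of the resulting derivative.

```prolog
% deriv(+Expr, -DExpr): symbolic derivative of Expr w.r.t. the atom x.
deriv(x, 1).
deriv(N, 0) :- number(N).
deriv(A + B, DA + DB) :- deriv(A, DA), deriv(B, DB).
deriv(A * B, DA*B + A*DB) :- deriv(A, DA), deriv(B, DB).

% eval(+Expr, +X, -V): numeric value of Expr at x = X.
eval(x, X, X).
eval(N, _, N) :- number(N).
eval(A + B, X, V) :- eval(A, X, VA), eval(B, X, VB), V is VA + VB.
eval(A * B, X, V) :- eval(A, X, VA), eval(B, X, VB), V is VA * VB.

% Example: d/dx (x*x + 3*x) at x = 2 is 2*2 + 3 = 7.
% ?- deriv(x*x + 3*x, D), eval(D, 2, V).
% D = 1*x+x*1+(0*x+3*1), V = 7.
```

From this specification the paper systematically derives forward- and reverse-mode AD variants; the sketch above only conveys the starting point.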
Keywords
Prolog, automatic differentiation, probabilistic programming