Asymptotically exact conditional inference in deep generative models and differentiable simulators

arXiv: Computation (2016)

Abstract
Many generative models can be expressed as a deterministic differentiable function of random inputs drawn from some simple probability density. This framework includes both deep generative architectures such as Variational Autoencoders and a large class of dynamical system simulators. We present a method for performing efficient MCMC inference in such models when conditioning on observations of the model output. For some models this offers an asymptotically exact inference method where Approximate Bayesian Computation might otherwise be employed. We use the intuition that conditional inference corresponds to integrating a density across the manifold corresponding to the set of inputs consistent with the observed outputs. This motivates the use of a constrained variant of Hamiltonian Monte Carlo which leverages the smooth geometry of the manifold to coherently move between states exactly satisfying the constraint. We validate the method by performing inference tasks in a diverse set of models: parameter inference in a dynamical predator-prey simulation, joint 3D pose and camera model inference from 2D projections, and image in-painting with a generative model of MNIST digit images.
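To make the geometric picture concrete, the sketch below illustrates one RATTLE-style constrained leapfrog step in JAX: the position update is projected back onto the constraint manifold {u : g(u) = y} and the momentum is projected into its tangent space. This is a minimal illustration of the general technique, not the paper's implementation; the toy generator g, the observation y_obs, the standard normal density on the inputs, and the step size are all placeholder assumptions chosen only to make the example runnable.

```python
import jax
import jax.numpy as jnp

# Toy differentiable generator mapping latent inputs u to outputs y.
# Placeholder assumption; any differentiable g with more inputs than outputs works here.
def g(u):
    return jnp.array([u[0] ** 2 + u[1], jnp.sin(u[2]) + u[1]])

jac_g = jax.jacobian(g)

def project_onto_manifold(u, y_obs, tol=1e-10, max_iter=50):
    """Gauss-Newton projection so that g(u) = y_obs (approximately)."""
    for _ in range(max_iter):
        c = g(u) - y_obs
        if jnp.linalg.norm(c) < tol:
            break
        J = jac_g(u)
        # Solve (J J^T) lam = c and move u along J^T lam to reduce the constraint violation.
        lam = jnp.linalg.solve(J @ J.T, c)
        u = u - J.T @ lam
    return u

def tangent_project(u, p):
    """Project momentum p into the tangent space of {u : g(u) = y} at u."""
    J = jac_g(u)
    return p - J.T @ jnp.linalg.solve(J @ J.T, J @ p)

def constrained_leapfrog_step(u, p, y_obs, step_size=0.05):
    """One simplified RATTLE-style step for a standard normal density on u
    (potential 0.5 * ||u||^2), keeping the state on the constraint manifold."""
    # Half momentum update.
    p = p - 0.5 * step_size * u
    # Full position update, then project back onto the manifold.
    u_new = project_onto_manifold(u + step_size * p, y_obs)
    # Momentum consistent with the actual (projected) displacement.
    p = (u_new - u) / step_size
    # Second half momentum update, then restrict to the tangent space.
    p = p - 0.5 * step_size * u_new
    p = tangent_project(u_new, p)
    return u_new, p

# Usage: start from a latent point already satisfying the constraint.
y_obs = jnp.array([1.0, 0.5])
u0 = project_onto_manifold(jnp.array([0.8, 0.3, 0.1]), y_obs)
p0 = tangent_project(u0, jax.random.normal(jax.random.PRNGKey(0), u0.shape))
u1, p1 = constrained_leapfrog_step(u0, p0, y_obs)
print("constraint residual after one step:", jnp.linalg.norm(g(u1) - y_obs))
```

A full constrained HMC transition would chain several such steps, apply a reversibility check, and accept or reject with a Metropolis correction; the sketch shows only the projection mechanics that keep the dynamics on the manifold of inputs consistent with the observations.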