Bayesian Learning with Wasserstein Barycenters.
arXiv: Machine Learning (2018)
Abstract
We introduce a novel paradigm for Bayesian learning based on optimal transport theory. Namely, we propose to use the Wasserstein barycenter of the posterior law on models as a predictive posterior, thus introducing an alternative to classical choices like the maximum a posteriori estimator and the Bayesian model average. We exhibit conditions granting the existence and statistical consistency of this estimator, discuss some of its basic and specific properties, and provide insight into its theoretical advantages. Finally, we introduce a novel numerical method which is ideally suited for the computation of our estimator, and we explicitly discuss its implementations for specific families of models. This method can be seen as a stochastic gradient descent algorithm in the Wasserstein space, and is of independent interest and applicability for the computation of Wasserstein barycenters. We also provide an illustrative numerical example for experimental validation of the proposed method.
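As a rough illustration of the "stochastic gradient descent in Wasserstein space" idea mentioned above (this is a simplified sketch, not the paper's algorithm): in one dimension, the 2-Wasserstein barycenter of a set of distributions is the distribution whose quantile function is the average of their quantile functions. A stochastic iteration that repeatedly moves the current estimate along the W2 geodesic toward a randomly sampled model, with decreasing step size, therefore converges to the barycenter. The Gaussian family and the step-size schedule below are illustrative choices.

```python
import numpy as np
from statistics import NormalDist

# Hypothetical "posterior sample" of models: three 1D Gaussians.
params = [(0.0, 1.0), (2.0, 0.5), (-1.0, 1.5)]

# Represent each distribution by its quantile function on a grid of levels.
levels = np.linspace(0.01, 0.99, 99)
quantiles = np.array([[NormalDist(m, s).inv_cdf(p) for p in levels]
                      for m, s in params])

rng = np.random.default_rng(0)
q = quantiles[0].copy()  # initial barycenter estimate
for k in range(2000):
    i = rng.integers(len(params))    # sample a model from the "posterior"
    eta = 1.0 / (k + 2)              # decreasing step size (Robbins-Monro)
    # Geodesic step in 1D Wasserstein space: interpolate quantile functions.
    q = (1 - eta) * q + eta * quantiles[i]

# In 1D the exact barycenter is the pointwise mean of the quantile functions,
# so the stochastic iterate should be close to it.
target = quantiles.mean(axis=0)
```

The update is exactly a stochastic gradient step for the functional mu -> mean of W2(mu, mu_i)^2, since in 1D the W2 geometry is flat in quantile coordinates; in higher dimensions the geodesic step requires optimal transport maps instead of a pointwise average.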
Keywords
Bayesian learning, non-parametric estimation, Wasserstein distance and barycenter, consistency, MCMC, stochastic gradient descent in Wasserstein space