Patterns of Scalable Bayesian Inference

Foundations and Trends in Machine Learning (2016)

Cited by 84 | Viewed 132
Abstract
Datasets are growing not just in size but in complexity, creating a demand for rich models and quantification of uncertainty. Bayesian methods are an excellent fit for this demand, but scaling Bayesian inference is a challenge. In response to this challenge, there has been considerable recent work based on varying assumptions about model structure, underlying computational resources, and the importance of asymptotic correctness. As a result, there is a zoo of ideas with a wide range of assumptions and applicability. Patterns of Scalable Bayesian Inference seeks to identify unifying principles, patterns, and intuitions for scaling Bayesian inference. It examines how these techniques can be scaled up to larger problems and scaled out across parallel computational resources. It reviews existing work on utilizing modern computing resources with both MCMC and variational approximation techniques. From this taxonomy of ideas, it characterizes the general principles that have proven successful for designing scalable inference procedures and addresses some of the significant open questions and challenges.
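As a concrete illustration of the serial MCMC baseline that the scalable methods surveyed here build on, the sketch below implements a minimal random-walk Metropolis sampler (a hedged example, not taken from the monograph itself; the target density and step size are illustrative assumptions):

```python
import math
import random

def metropolis_hastings(log_density, x0, n_samples, step=1.0, seed=0):
    """Random-walk Metropolis: the serial baseline that scalable
    variants (parallel chains, subsampled likelihoods) start from."""
    rng = random.Random(seed)
    x, samples = x0, []
    for _ in range(n_samples):
        proposal = x + rng.gauss(0.0, step)
        # Accept with probability min(1, p(proposal) / p(x)),
        # computed in log space for numerical stability.
        if math.log(rng.random() + 1e-300) < log_density(proposal) - log_density(x):
            x = proposal
        samples.append(x)
    return samples

# Illustrative target: standard normal, log density up to a constant.
samples = metropolis_hastings(lambda x: -0.5 * x * x, x0=0.0, n_samples=20000)
mean = sum(samples) / len(samples)
var = sum((s - mean) ** 2 for s in samples) / len(samples)
```

Because each iteration evaluates the full log density, the cost per step grows with dataset size; much of the work the abstract describes is about breaking exactly this bottleneck.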
Keywords
Machine Learning, Theoretical Computer Science, Bayesian learning, Markov chain Monte Carlo, Variational Inference, Parallel algorithms