Graphical Model Inference using GPUs

MSRA (2007)

Abstract
Graphical modeling is a method that is increasingly being used to model real-life applications such as medical diagnosis, security and emergency response systems, and computer vision. A graphical model represents random variables and the dependencies between them by means of a graph, in which some variables may not be observable in practice. An important problem in graphical models is inference: the process of finding the probability distributions of hidden variables given those of the observed variables. A general technique for inference in graphical models uses junction trees. Inference on junction trees is computationally intensive and exhibits a large amount of data-level parallelism. However, the large number and poor locality of memory accesses limit the parallelism exploitable on CPU platforms. Graphics Processing Units (GPUs) are well suited to exploiting data-level parallelism and are increasingly being used for scientific computation. In this work, we exploit the support for data-level parallelism on GPUs to speed up exact inference on junction trees and rely on multi-threading to hide memory latency. We achieve two orders of magnitude speedup on a CPU/GPU system using an NVIDIA GeForce 8800 part over a standalone CPU system.
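To illustrate the kind of data-level parallelism the abstract refers to, the following is a minimal CUDA sketch (not the authors' code) of one core step in junction tree message passing: marginalizing a clique potential table into a separator potential, with one GPU thread per separator entry. The table layout (separator variables as the slowest-varying dimensions, so each separator assignment owns a contiguous block of the clique table) and all names such as marginalizeKernel are illustrative assumptions.

```cuda
// Sketch only: sum out non-separator variables of a clique potential,
// one thread per separator entry. Assumes each separator assignment
// corresponds to a contiguous block of blockPerSep clique entries.
#include <cstdio>
#include <cuda_runtime.h>

__global__ void marginalizeKernel(const float* clique, float* sep,
                                  int sepSize, int blockPerSep)
{
    int s = blockIdx.x * blockDim.x + threadIdx.x;
    if (s >= sepSize) return;

    float sum = 0.0f;
    const float* base = clique + (size_t)s * blockPerSep;
    for (int i = 0; i < blockPerSep; ++i)   // sum out non-separator variables
        sum += base[i];
    sep[s] = sum;
}

int main()
{
    // Toy example: clique over 3 binary variables, separator over the first 2.
    const int cliqueSize = 8, sepSize = 4, blockPerSep = cliqueSize / sepSize;
    float hClique[cliqueSize] = {0.1f, 0.2f, 0.3f, 0.4f, 0.5f, 0.6f, 0.7f, 0.8f};
    float hSep[sepSize];

    float *dClique, *dSep;
    cudaMalloc(&dClique, cliqueSize * sizeof(float));
    cudaMalloc(&dSep, sepSize * sizeof(float));
    cudaMemcpy(dClique, hClique, cliqueSize * sizeof(float), cudaMemcpyHostToDevice);

    marginalizeKernel<<<1, 128>>>(dClique, dSep, sepSize, blockPerSep);
    cudaMemcpy(hSep, dSep, sepSize * sizeof(float), cudaMemcpyDeviceToHost);

    for (int s = 0; s < sepSize; ++s)
        printf("sep[%d] = %.2f\n", s, hSep[s]);

    cudaFree(dClique);
    cudaFree(dSep);
    return 0;
}
```

In a full junction tree pass, many such marginalizations and the subsequent table scaling steps run across thousands of independent entries, which is the kind of workload that maps naturally onto the GeForce 8800's many hardware threads and lets memory latency be hidden by thread switching.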