Learning Markov graphs up to edit distance.

ISIT 2012

Abstract
This paper presents a rate distortion approach to Markov graph learning. It provides lower bounds on the number of samples required for any algorithm to learn the Markov graph structure of a probability distribution, up to edit distance. We first prove a general result for any probability distribution, and then specialize it for Ising and Gaussian models. In particular, for both Ising and Gaussian models on p variables with degree at most d, we show that at least Omega((d - s/p) log p) samples are required for any algorithm to learn the graph structure up to edit distance s. Our bounds represent a strong converse; i.e., we show that with any fewer samples, the probability of error goes to 1 as the problem size increases. These results show that substantial gains in sample complexity may not be possible without paying a significant price in edit distance error.
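As a rough illustration of how the stated lower bound scales, the following sketch evaluates (d - s/p) log p for example parameter choices. This is not from the paper: the multiplicative constant c is hypothetical (hidden by the Omega notation), and the function merely shows how tolerating a larger edit distance s shrinks the bound.

```python
import math

def sample_lower_bound(p, d, s, c=1.0):
    """Illustrative evaluation of the Omega((d - s/p) * log p) lower bound.

    p: number of variables in the Ising/Gaussian model
    d: maximum degree of the Markov graph
    s: edit distance tolerated in the recovered graph
    c: hypothetical constant hidden by the Omega notation (assumed)
    """
    return c * (d - s / p) * math.log(p)

# Exact recovery (s = 0) requires on the order of d * log p samples,
# while tolerating s edits lowers the required number of samples.
exact_bound = sample_lower_bound(p=1000, d=5, s=0)
relaxed_bound = sample_lower_bound(p=1000, d=5, s=2000)
```

Note that the bound only weakens meaningfully once s is comparable to p, which reflects the paper's conclusion that large sample-complexity savings require a significant edit distance error.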
Keywords
Gaussian processes, Markov processes, graph theory, probability, Gaussian model, Ising model, Markov graph, edit distance, error probability, rate distortion approach, Gaussian Markov model, Markov networks, graphical models, strong converse