On the complexity of computational problems regarding distributions

Studies in Complexity and Cryptography (2011)

Abstract
We consider two basic computational problems regarding discrete probability distributions: (1) approximating the statistical difference (aka variation distance) between two given distributions, and (2) approximating the entropy of a given distribution. Both problems are considered in two different settings. In the first setting the approximation algorithm is only given samples from the distributions in question, whereas in the second setting the algorithm is given the "code" of a sampling device (for the distributions in question). We survey the known results regarding both settings, noting that they are fundamentally different: The first setting is concerned with the number of samples required for determining the quantity in question, and is thus essentially information-theoretic. In the second setting the quantities in question are determined by the input, and the question is merely one of computational complexity. The focus of this survey is on the latter setting. In particular, the survey includes proof sketches of three central results regarding the latter setting, one of which has previously appeared only in the second author's PhD thesis.
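
For reference, the two quantities discussed in the abstract have standard definitions (not quoted from the paper itself): for distributions X and Y over a finite domain,

\[
\mathrm{SD}(X, Y) \;=\; \tfrac{1}{2} \sum_{x} \bigl| \Pr[X = x] - \Pr[Y = x] \bigr|,
\qquad
\mathrm{H}(X) \;=\; \sum_{x} \Pr[X = x] \,\log_2 \frac{1}{\Pr[X = x]}.
\]

In the second setting the input is a description of a sampling device, which determines these quantities exactly, so the only question is the computational cost of approximating them.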
Keywords
discrete probability distribution, aka variation distance, latter setting, different setting, information-theoretic, approximation algorithm, PhD thesis, central result, computational complexity, basic computational problem, zero knowledge, approximation, reductions, entropy