Sampling lower bounds via information theory

STOC 2003

Cited by 48
Abstract
We present a novel technique, based on the Jensen-Shannon divergence from information theory, to prove lower bounds on the query complexity of sampling algorithms that approximate functions over arbitrary domain and range. Unlike previous methods, our technique does not use a reduction from a decision promise problem. As a result, it gives stronger bounds for functions that possess a large set of inputs, each two of which exhibit a gap in the function value.

We demonstrate the technique with new query complexity lower bounds for three fundamental problems: (1) the "election problem", for which we obtain a quadratic improvement over previous bounds, (2) low rank matrix approximation, for which we prove the first lower bounds, showing that the algorithms given for this problem are almost optimal, and (3) matrix reconstruction.

In addition, we introduce a new method for proving lower bounds on the expected query complexity of functions, using the Kullback-Leibler divergence. We demonstrate its use by a simple query complexity lower bound for the mean.
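For reference, the two divergences named in the abstract have the following standard definitions for distributions P and Q over a common discrete domain; the paper's own lower-bound lemmas (which use a generalized Jensen-Shannon divergence over many distributions) are not reproduced here.

\[
D_{\mathrm{KL}}(P \,\|\, Q) \;=\; \sum_{x} P(x)\,\log\frac{P(x)}{Q(x)},
\qquad
\mathrm{JS}(P, Q) \;=\; \tfrac{1}{2}\,D_{\mathrm{KL}}(P \,\|\, M) + \tfrac{1}{2}\,D_{\mathrm{KL}}(Q \,\|\, M),
\quad M = \tfrac{1}{2}(P + Q).
\]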
Keywords
novel technique, query complexity, election problem, new query complexity, Jensen-Shannon divergence, simple query complexity, decision promise problem, lower bounds, fundamental problem, expected query complexity, information theory, lower bound, Kullback-Leibler divergence