A Sampling Technique of Proving Lower Bounds for Noisy Computations.

CoRR (2015)

Abstract
We present a technique for proving lower bounds for noisy computations. It rests on a theorem connecting computations on a certain kind of randomized decision tree with sampling-based algorithms. This approach is surprisingly powerful and applies to several previously studied models of computation. As a first illustration, we show that all the results of Evans and Pippenger (SIAM J. Computing, 1999) for noisy decision trees, some of which were originally derived using Fourier analysis, follow immediately once we consider the sampling-based algorithms that naturally arise from these decision trees. Next, we show a tight lower bound of $\Omega(N \log\log N)$ on the number of transmissions required to compute several functions (including the parity function and the majority function) in a network of $N$ randomly placed sensors that communicate via local transmissions and operate with power near the connectivity threshold. This result considerably simplifies and strengthens an earlier result of Dutta, Kanoria, Manjunath, and Radhakrishnan (SODA 2008) showing that such networks cannot reliably compute the parity function with significantly fewer than $N \log\log N$ transmissions. That earlier lower bound for parity exploited special properties of the parity function and is inapplicable, e.g., to the majority function. In this paper, we use our approach to develop a connection between the computation of Boolean functions on noisy networks that make few transmissions and algorithms that work by sampling only a part of the input. It is straightforward to verify that such sampling-based algorithms cannot compute the majority function.