Lower Bounds for XOR of Forrelations.

CoRR (2020)

Abstract
The Forrelation problem, introduced by Aaronson [A10] and Aaronson and Ambainis [AA15], is a well-studied problem in the context of separating quantum and classical models. Variants of this problem were used to give exponential separations between quantum and classical query complexity [A10, AA15]; quantum query complexity and bounded-depth circuits [RT19]; and quantum and classical communication complexity [GRT19]. In all these separations, the lower bound for the classical model holds only when the advantage of the protocol (over a random guess) is more than $\approx 1/\sqrt{N}$, that is, when the success probability is larger than $\approx 1/2 + 1/\sqrt{N}$. To achieve separations when the classical protocol has a smaller advantage, we study in this work the XOR of $k$ independent copies of the Forrelation function (where $k \ll N$). We prove a very general result showing that any family of Boolean functions that is closed under restrictions, and whose Fourier mass at level $2k$ is bounded by $\alpha^k$, cannot compute the XOR of $k$ independent copies of the Forrelation function with advantage better than $O\left(\frac{\alpha^k}{N^{k/2}}\right)$. This strengthens a result of [CHLT19], which gave a similar result for $k=1$ using the technique of [RT19]. As an application of our result, we give the first example of a partial Boolean function that can be computed by a simultaneous-message quantum protocol of cost $\mbox{polylog}(N)$ (when the players share $\mbox{polylog}(N)$ EPR pairs), whereas any classical interactive randomized protocol of cost at most $\tilde{o}(N^{1/4})$ has only quasipolynomially small advantage over a random guess. We also give the first example of a partial Boolean function that has a quantum query algorithm of cost $\mbox{polylog}(N)$, and such that any constant-depth circuit of quasipolynomial size has only quasipolynomially small advantage over a random guess.
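
As background (not part of the abstract above), recall the Forrelation quantity of Aaronson and Ambainis [AA15] for $f, g : \{0,1\}^n \to \{-1,1\}$ with $N = 2^n$; the formula and thresholds below are the standard ones from [AA15], included here only for orientation, and the paper's precise variant may differ:

$$\Phi_{f,g} \;=\; \frac{1}{2^{3n/2}} \sum_{x, y \in \{0,1\}^n} f(x)\,(-1)^{x \cdot y}\,g(y).$$

The (partial) Forrelation function outputs $1$ on inputs with $\Phi_{f,g} \ge 3/5$ and $0$ on inputs with $|\Phi_{f,g}| \le 1/100$; the paper studies the XOR of $k$ independent copies of this function.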
Keywords
lower bounds, XOR, forrelations