Approximate resilience, monotonicity, and the complexity of agnostic learning

SODA (2015)

Abstract
A function f is d-resilient if all its Fourier coefficients of degree at most d are zero, i.e. f is uncorrelated with all low-degree parities. We study the notion of approximate resilience of Boolean functions, where we say that f is α-approximately d-resilient if f is α-close to a [−1, 1]-valued d-resilient function in ℓ1 distance. We show that approximate resilience essentially characterizes the complexity of agnostic learning of a concept class C over the uniform distribution. Roughly speaking, if all functions in a class C are far from being d-resilient then C can be learned agnostically in time n^{O(d)} and conversely, if C contains a function close to being d-resilient then agnostic learning of C in the statistical query (SQ) framework of Kearns has complexity of at least n^{Ω(d)}. Focusing on monotone Boolean functions, we exhibit the existence of near-optimal α-approximately Ω̃(α√n)-resilient monotone functions for all α > 0. Prior to our work, it was conceivable even that every monotone function is Ω(1)-far from any 1-resilient function. Furthermore, we construct simple, explicit monotone functions based on Tribes and CycleRun that are close to highly resilient functions. Our constructions are based on general resilience analysis and amplification techniques we introduce. These structural results, together with the characterization, imply nearly optimal lower bounds for agnostic learning of monotone juntas, a natural variant of the well-studied junta learning problem. In particular we show that no SQ algorithm can efficiently agnostically learn monotone k-juntas for any k = ω(1) and any constant error less than 1/2.
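As an illustration of the definition (this sketch is not from the paper), d-resilience of a small Boolean function can be checked by brute force: compute every Fourier coefficient f̂(S) = E_x[f(x)·∏_{i∈S} x_i] over the uniform distribution on {−1, 1}^n and verify that all coefficients of degree at most d vanish. The functions `parity` and `majority` below are standard examples chosen for illustration.

```python
import math
from itertools import product, combinations

def fourier_coefficient(f, n, S):
    """Compute \hat{f}(S) = E_x[f(x) * chi_S(x)] by enumerating {-1,1}^n."""
    total = 0.0
    for x in product([-1, 1], repeat=n):
        chi = math.prod(x[i] for i in S)  # parity on the coordinates in S
        total += f(x) * chi
    return total / 2 ** n

def is_d_resilient(f, n, d, tol=1e-9):
    """f is d-resilient iff every Fourier coefficient of degree <= d is zero."""
    for k in range(d + 1):
        for S in combinations(range(n), k):
            if abs(fourier_coefficient(f, n, S)) > tol:
                return False
    return True

# Parity on all n bits correlates only with the full parity, so it is
# (n-1)-resilient; majority is balanced (0-resilient) but has nonzero
# degree-1 coefficients, so it is not 1-resilient.
parity = lambda x: math.prod(x)
majority = lambda x: 1 if sum(x) > 0 else -1

print(is_d_resilient(parity, 5, 4))    # parity of 5 bits is 4-resilient
print(is_d_resilient(majority, 5, 0))  # majority of 5 bits is balanced
print(is_d_resilient(majority, 5, 1))  # but not 1-resilient
```

The enumeration costs 2^n per coefficient, so this is only feasible for small n; it is meant to make the definition concrete, not to be an efficient procedure.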