On-Average KL-Privacy and Its Equivalence to Generalization for Max-Entropy Mechanisms

Privacy in Statistical Databases: UNESCO Chair in Data Privacy (2016)

Abstract
We define On-Average KL-Privacy and present its properties and connections to differential privacy, generalization, and information-theoretic quantities including max-information and mutual information. The new definition significantly weakens differential privacy while preserving its minimal design features, such as composition over small groups and multiple queries, as well as closure under post-processing. Moreover, we show that On-Average KL-Privacy is equivalent to generalization for a large class of commonly used tools in statistics and machine learning that sample from Gibbs distributions, a class of distributions that arises naturally from the maximum entropy principle. In addition, a by-product of our analysis yields a lower bound on generalization error in terms of mutual information, which reveals an interesting interplay with known upper bounds that use the same quantity.
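For readers skimming this page, here is a hedged sketch of the definition in our own notation, reconstructed from the abstract rather than quoted from the paper. A mechanism A satisfies epsilon-On-Average KL-Privacy when the KL divergence between its output distributions on a dataset and on a neighboring dataset is small on average over data drawn from the population P:

\mathbb{E}_{Z \sim P^n,\; z' \sim P}\left[ D_{\mathrm{KL}}\big( A(Z) \,\|\, A(Z^{i \to z'}) \big) \right] \le \varepsilon,

where Z^{i \to z'} denotes Z with its i-th record replaced by the fresh draw z'. The Gibbs (maximum-entropy) mechanisms referenced in the abstract are those that release a sample \theta drawn with density proportional to \exp(-\gamma L(\theta, Z)) for a loss function L and temperature parameter \gamma. Both the averaging convention and this parameterization are illustrative assumptions; consult the paper for the exact statements.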
Keywords
Differential privacy, Generalization, Stability, Information theory, Maximum entropy, Statistical learning theory