Differentially Private Confidence Intervals for Empirical Risk Minimization

J. Priv. Confidentiality (2018)

Abstract
The process of data mining with differential privacy produces results that are affected by two types of noise: sampling noise due to data collection and privacy noise that is designed to prevent the reconstruction of sensitive information. In this paper, we consider the problem of designing confidence intervals for the parameters of a variety of differentially private machine learning models. Our algorithms can provide confidence intervals that satisfy differential privacy (as well as the more recently proposed concentrated differential privacy) and can be used with existing differentially private mechanisms that train models using objective perturbation and output perturbation.
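The paper's confidence-interval constructions are not reproduced here, but the following is a minimal sketch of one of the mechanisms the abstract refers to: output perturbation for differentially private empirical risk minimization. It assumes L2-regularized logistic regression with features scaled to norm at most 1 (so the loss is 1-Lipschitz), uses the standard 2/(n*lam) sensitivity bound for the ERM minimizer, and releases the parameters through the Gaussian mechanism for (eps, delta)-differential privacy. The function names, constants, and training loop are illustrative assumptions, not the authors' code.

```python
# Sketch of output perturbation for DP regularized logistic regression.
# Assumptions (not from the abstract): ||x|| <= 1, labels in {-1, +1},
# logistic loss (1-Lipschitz), regularizer (lam/2)||w||^2, and the
# Gaussian mechanism with L2 sensitivity 2/(n*lam) of the ERM minimizer.
import numpy as np

def train_erm(X, y, lam, lr=0.1, steps=2000):
    """Plain (non-private) regularized logistic regression via gradient descent."""
    n, d = X.shape
    w = np.zeros(d)
    for _ in range(steps):
        margins = y * (X @ w)
        # gradient of (1/n) sum log(1 + exp(-y x.w)) + (lam/2)||w||^2
        grad = -(X * (y / (1.0 + np.exp(margins)))[:, None]).mean(axis=0) + lam * w
        w -= lr * grad
    return w

def output_perturbation(X, y, lam, eps, delta, rng=None):
    """Release a noisy ERM minimizer via the Gaussian mechanism (a sketch)."""
    rng = np.random.default_rng() if rng is None else rng
    n, d = X.shape
    w_hat = train_erm(X, y, lam)
    sensitivity = 2.0 / (n * lam)  # L2 sensitivity for a 1-Lipschitz loss
    sigma = sensitivity * np.sqrt(2.0 * np.log(1.25 / delta)) / eps
    return w_hat + rng.normal(scale=sigma, size=d)

# Usage on synthetic data, just to exercise the functions.
rng = np.random.default_rng(0)
X = rng.normal(size=(500, 3))
X /= np.maximum(1.0, np.linalg.norm(X, axis=1, keepdims=True))  # enforce ||x|| <= 1
y = np.sign(X @ np.array([1.0, -2.0, 0.5]) + 0.1 * rng.normal(size=500))
w_private = output_perturbation(X, y, lam=0.1, eps=1.0, delta=1e-5)
print(w_private)
```

A confidence interval in the paper's setting must account for both the sampling noise in the fitted parameters and the added privacy noise; only the privacy-noise scale (sigma above) is explicit in this sketch.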
Keywords
Differential Privacy, Objective Perturbation, Output Perturbation, Confidence Intervals