Small Count Privacy and Large Count Utility in Data Publishing

CoRR(2012)

Cited by 26
Abstract
While the introduction of differential privacy has been a major breakthrough in the study of privacy-preserving data publication, some recent work has pointed out a number of cases where it is not possible to limit inference about individuals. The dilemma intrinsic to the problem is the simultaneous requirement of utility in the published data. Differential privacy does not aim to protect information about an individual that can be uncovered even without that individual's participation. However, this lack of coverage may violate the principle of individual privacy. Here we propose a solution that protects sensitive information, by which we refer to the answers to aggregate queries with small counts. Previous works based on $\ell$-diversity can be seen as providing a special form of this kind of protection. Our method is also developed with the goal of providing a differential privacy guarantee, and to that end we introduce a more refined form of differential privacy to deal with certain practical issues. Our empirical studies show that our method preserves better utility than a number of state-of-the-art methods, even though those methods do not provide the protection that ours does.
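For context on the aggregate count queries the abstract refers to, the following is a minimal sketch of the standard Laplace mechanism applied to a single count query (sensitivity 1). It illustrates why small counts lose utility under noise while large counts retain it; this is generic background, not the paper's refined mechanism, and the function name `laplace_count` is only illustrative.

```python
import numpy as np

def laplace_count(true_count, epsilon, rng=None):
    """Release a count query answer under standard epsilon-differential privacy.

    A count query has sensitivity 1 (adding or removing one individual changes
    the answer by at most 1), so Laplace noise with scale 1/epsilon suffices
    for epsilon-DP.
    """
    rng = rng if rng is not None else np.random.default_rng()
    return true_count + rng.laplace(loc=0.0, scale=1.0 / epsilon)

# Small counts are dominated by the noise at strong privacy levels, while
# large counts keep most of their utility -- the tension the abstract notes.
rng = np.random.default_rng(0)
print(laplace_count(3, epsilon=0.1, rng=rng))     # small count: large relative error
print(laplace_count(5000, epsilon=0.1, rng=rng))  # large count: relative error is negligible
```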