# A Multiplicative Weights Mechanism for Privacy-Preserving Data Analysis

FOCS 2010, pp. 61–70

Abstract

We consider statistical data analysis in the interactive setting. In this setting a trusted curator maintains a database of sensitive information about individual participants, and releases privacy-preserving answers to queries as they arrive. Our primary contribution is a new differentially private multiplicative weights mechanism for an…

Highlights

- Statistical analysis of sensitive information about individuals comes with important benefits
- This research has yielded the robust privacy guarantee of differential privacy, due to Dwork et al. [5], which guarantees that the outcome of the analysis on adjacent databases is “very similar”
- Differential privacy guarantees that participation in the analysis does not incur significant additional risk for individuals
- Throughout this paper and most of the prior work, the focus is on the setting where a trusted curator, holding a database of potentially sensitive information about n individuals, wishes to release statistics about the data while protecting individuals’ privacy
- Sublinear Time Mechanism for Smooth Databases: We observe that the private multiplicative weights mechanism can be modified to work over a smaller data universe V ⊆ U, as long as there exists a database x∗ whose support lies only in V and that gives answers close to those of x on every query we will be asked
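The reweighting idea behind the multiplicative weights mechanism can be sketched as follows. This is a simplified, non-private illustration of a single update step; the function name, learning rate `eta`, and universe size are illustrative choices, not taken from the paper:

```python
import numpy as np

def mw_update(x, f, target, eta=0.5):
    """One multiplicative-weights step: reweight the fractional histogram x
    (a distribution over the data universe) so that the answer of the linear
    query f (a vector in [0, 1]^N) moves toward the target answer."""
    estimate = float(f @ x)
    sign = 1.0 if estimate < target else -1.0
    x = x * np.exp(sign * eta * f)  # up- or down-weight the atoms hit by f
    return x / x.sum()              # renormalize so x stays a distribution
```

Repeating this step on queries whose estimates are far off drives the maintained histogram toward one that answers all asked queries well, which is the engine behind the mechanism's utility analysis.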

Results

- The authors find this utility guarantee to still be well motivated: privacy aside, the input database itself, which is sampled i.i.d. from an underlying distribution, is not guaranteed to yield good answers for adaptively chosen queries.
- The authors say that a mechanism M is (α, β, k)-accurate for a database x if, when it is run for k rounds on any k linear queries, with all but β probability over the mechanism’s coins, ∀t ∈ [k], |a_t − ⟨f_t, x⟩| ≤ α.
- The authors say that a mechanism M is (α, β, k)-non-adaptively accurate for a query sequence C of size k and a database x if, when it is run for k rounds on the queries in C, with all but β probability over the mechanism’s coins, ∀t ∈ [k], |a_t − ⟨f_t, x⟩| ≤ α.
- The interaction of a mechanism M(x) and an adversary A specifies a probability distribution [M(x), A] over transcripts, i.e., sequences of queries and answers f_1, a_1, f_2, a_2, ….
- The authors say a mechanism M provides (ε, δ)-differential privacy for a class of queries F if, for every adversary A and every two histograms x, x′ ∈ ℝ^N satisfying ‖x − x′‖₁ ≤ 1/n, the following holds: letting P = [M(x), A] and P′ = [M(x′), A] denote the transcript distributions, for every set S of transcripts, Pr[P ∈ S] ≤ e^ε · Pr[P′ ∈ S] + δ.
- In the PMW mechanism of Figure 1, in each round t the authors are given a linear query f_t over U, and x_t denotes a fractional histogram computed in round t.
- The authors define a notion of average case complexity for interactive mechanisms that allows them to improve the running time of the PMW mechanism as a function of the data universe size.

Conclusion

- Let U be a data universe, C a class of linear queries over U , and x∗ a ξ-smooth histogram over U .
- For a given smoothness parameter ξ, data universe U, and query class C, let V ⊆ U be a sub-universe sampled uniformly at random from U.
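The sub-universe step above can be sketched as follows: sample V uniformly at random and restrict the histogram to it. This is a hedged illustration only; the sub-universe size `m` and the renormalization detail are assumptions, and the smoothness condition is what makes the restricted histogram useful:

```python
import numpy as np

def restrict_to_sub_universe(x, universe_size, m, rng):
    """Sample a sub-universe V of m elements uniformly at random and
    restrict the histogram x to V, renormalizing the surviving mass.
    For a smooth x (no atom carries disproportionate mass), the restricted
    histogram still answers linear queries approximately (sketch)."""
    V = rng.choice(universe_size, size=m, replace=False)
    xV = x[V]
    return V, xV / xV.sum()
```

Running the mechanism over V instead of U is what yields the sublinear (in |U|) running time claimed for smooth databases.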

References

- S. Arora, E. Hazan, and S. Kale. The multiplicative weights update method: a meta algorithm and applications. Technical report, Princeton University, 2005.
- A. Blum, K. Ligett, and A. Roth. A learning theory approach to non-interactive database privacy. In STOC ’08: Proceedings of the 40th annual ACM symposium on Theory of computing, pages 609–618, New York, NY, USA, 2008. ACM.
- I. Dinur and K. Nissim. Revealing information while preserving privacy. In Proc. 22nd PODS, pages 202–210. ACM, 2003.
- C. Dwork and J. Lei. Differential privacy and robust statistics. In Proc. 41st STOC, pages 371–380. ACM, 2009.
- C. Dwork, F. McSherry, K. Nissim, and A. Smith. Calibrating noise to sensitivity in private data analysis. In Proc. 3rd TCC, pages 265–284, 2006.
- C. Dwork, F. McSherry, and K. Talwar. The price of privacy and the limits of LP decoding. In Proc. 39th STOC, pages 85–94. ACM, 2007.
- C. Dwork, M. Naor, T. Pitassi, and G. N. Rothblum. Differential privacy under continual observation. In Proc. 42nd STOC. ACM, 2010.
- C. Dwork, M. Naor, O. Reingold, G. N. Rothblum, and S. P. Vadhan. On the complexity of differentially private data release: efficient algorithms and hardness results. In Proc. 41st STOC, pages 381–390. ACM, 2009.
- C. Dwork and K. Nissim. Privacy-preserving datamining on vertically partitioned databases. In Proc. 24th CRYPTO, pages 528–544, 2004.
- C. Dwork, G. N. Rothblum, and S. Vadhan. Boosting and differential privacy. Manuscript, 2010.
- C. Dwork and S. Yekhanin. New efficient attacks on statistical disclosure control mechanisms. In Proc. 28th CRYPTO, pages 469–480, 2008.
- N. Littlestone and M. K. Warmuth. The weighted majority algorithm. Inf. Comput., 108(2):212–261, 1994.
- A. Roth and T. Roughgarden. Interactive privacy via the median mechanism. In STOC, pages 765–774, 2010.
- J. Ullman and S. Vadhan. PCPs and the hardness of generating synthetic data. Electronic Colloquium on Computational Complexity (ECCC), 1(17), 2010.
