SoK: A Review of Differentially Private Linear Models For High-Dimensional Data
arXiv (2024)
Abstract
Linear models are ubiquitous in data science, but are particularly prone to
overfitting and data memorization in high dimensions. To guarantee the privacy
of training data, differential privacy can be used. Many papers have proposed
optimization techniques for high-dimensional differentially private linear
models, but a systematic comparison between these methods does not exist. We
close this gap by providing a comprehensive review of optimization methods for
private high-dimensional linear models. Empirical tests on all methods
demonstrate robust and coordinate-optimized algorithms perform best, which can
inform future research. Code for implementing all methods is released online.
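To make the setting concrete, here is a minimal sketch of one of the simplest approaches to differentially private linear models: output perturbation, where a ridge regression solution is computed non-privately and Gaussian noise calibrated to its sensitivity is added. The norm bounds, the Lipschitz constant `L`, and the sensitivity expression `2L/(n*lam)` are illustrative assumptions in the style of standard DP-ERM analyses, not the methods surveyed in this paper.

```python
import numpy as np

def dp_ridge_output_perturbation(X, y, lam=0.1, epsilon=1.0, delta=1e-5, seed=None):
    """Illustrative (epsilon, delta)-DP ridge regression via output perturbation.

    Assumptions (not from the paper): each row satisfies ||x_i||_2 <= 1 and
    |y_i| <= 1, the per-sample squared loss is L-Lipschitz in w on the
    relevant domain, and the objective is lam-strongly convex, so the L2
    sensitivity of the minimizer is at most 2*L/(n*lam).
    """
    rng = np.random.default_rng(seed)
    n, d = X.shape
    # Non-private ridge solution of (1/n)||Xw - y||^2 + lam ||w||^2:
    w = np.linalg.solve(X.T @ X + n * lam * np.eye(d), X.T @ y)
    # Assumed Lipschitz bound for the per-sample loss under the norm bounds above.
    L = 2.0
    sensitivity = 2.0 * L / (n * lam)
    # Gaussian-mechanism noise scale for (epsilon, delta)-DP.
    sigma = sensitivity * np.sqrt(2.0 * np.log(1.25 / delta)) / epsilon
    return w + rng.normal(scale=sigma, size=d)
```

In high dimensions, noise with scale proportional to `sqrt(d)` in norm can swamp the signal, which is precisely why the specialized high-dimensional methods the paper surveys exist.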