Private Gradient Descent for Linear Regression: Tighter Error Bounds and Instance-Specific Uncertainty Estimation
CoRR (2024)
Abstract
We provide an improved analysis of standard differentially private gradient
descent for linear regression under the squared error loss. Under modest
assumptions on the input, we characterize the distribution of the iterate at
each time step.
Our analysis leads to new results on the algorithm's accuracy: for a proper
fixed choice of hyperparameters, the sample complexity depends only linearly on
the dimension of the data. This matches the dimension-dependence of the
(non-private) ordinary least squares estimator as well as that of recent
private algorithms that rely on sophisticated adaptive gradient-clipping
schemes (Varshney et al., 2022; Liu et al., 2023).
Our analysis of the iterates' distribution also allows us to construct
confidence intervals for the empirical optimizer which adapt automatically to
the variance of the algorithm on a particular data set. We validate our
theorems through experiments on synthetic data.
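The algorithm analyzed is standard differentially private gradient descent: at each step, per-example gradients of the squared error loss are clipped, averaged, perturbed with Gaussian noise, and used to update the iterate. The sketch below illustrates this template; it is not the paper's exact specification, and the hyperparameters (`T`, `eta`, `clip`, `sigma`) are illustrative placeholders rather than the calibrated choices from the analysis. Calibrating `sigma` to a target (ε, δ) privacy budget is omitted here.

```python
import numpy as np

def dp_gradient_descent(X, y, T=100, eta=0.1, clip=1.0, sigma=1.0, seed=0):
    """Minimal sketch of DP gradient descent for linear regression
    under squared error loss, with per-example gradient clipping and
    Gaussian noise. Hyperparameter values are assumptions for
    illustration, not the paper's prescribed settings."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    theta = np.zeros(d)
    for _ in range(T):
        # Per-example gradients of (1/2)(x^T theta - y)^2: (x^T theta - y) * x.
        residuals = X @ theta - y               # shape (n,)
        grads = residuals[:, None] * X          # shape (n, d)
        # Clip each example's gradient to norm at most `clip`.
        norms = np.linalg.norm(grads, axis=1, keepdims=True)
        grads = grads * np.minimum(1.0, clip / np.maximum(norms, 1e-12))
        # Average, add Gaussian noise scaled by the clipping sensitivity,
        # and take a gradient step.
        noise = rng.normal(0.0, sigma * clip / n, size=d)
        theta = theta - eta * (grads.mean(axis=0) + noise)
    return theta

# Example usage on synthetic data:
# rng = np.random.default_rng(1)
# X = rng.normal(size=(1000, 5)); y = X @ np.ones(5) + 0.1 * rng.normal(size=1000)
# theta_hat = dp_gradient_descent(X, y)
```

Because the added noise is Gaussian at every step, each iterate's distribution is tractable; this is what the paper exploits to derive instance-specific confidence intervals for the empirical optimizer.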