ReSQueing Parallel and Private Stochastic Convex Optimization

2023 IEEE 64th Annual Symposium on Foundations of Computer Science (FOCS 2023)

Abstract
We introduce a new tool for stochastic convex optimization (SCO): a Reweighted Stochastic Query (ReSQue) estimator for the gradient of a function convolved with a (Gaussian) probability density. Combining ReSQue with recent advances in ball oracle acceleration [CJJ+20], [ACJ+21], we develop algorithms achieving state-of-the-art complexities for SCO in parallel and private settings. For a SCO objective constrained to the unit ball in R^d, we obtain the following results (up to polylogarithmic factors).

1) We give a parallel algorithm obtaining optimization error ε_opt with d^{1/3} ε_opt^{-2/3} gradient oracle query depth and d^{1/3} ε_opt^{-2/3} + ε_opt^{-2} gradient queries in total, assuming access to a bounded-variance stochastic gradient estimator. For ε_opt ∈ [d^{-1}, d^{-1/4}], our algorithm matches the state-of-the-art oracle depth of [BJL+19] while maintaining the optimal total work of stochastic gradient descent.

2) Given n samples of Lipschitz loss functions, prior works [BFTT19], [BFGT20], [AFKT21], [KLL21] established that if n ≳ d ε_dp^{-2}, then (ε_dp, δ)-differential privacy is attained at no asymptotic cost to the SCO utility. However, these prior works all required a superlinear number of gradient queries. We close this gap for sufficiently large n ≳ d^2 ε_dp^{-3}, by using ReSQue to design an algorithm with near-linear gradient query complexity in this regime.
Keywords
stochastic optimization, parallel computation, differential privacy