Compressed Differentially Private Distributed Optimization with Linear Convergence

IFAC-PapersOnLine (2023)

Abstract
This paper addresses the problem of differentially private distributed optimization under limited communication, where each agent aims to keep its cost function private while minimizing the sum of all agents' cost functions. In response, we propose a novel Compressed differentially Private distributed Gradient Tracking algorithm (CPGT). We demonstrate that CPGT achieves linear convergence for smooth and strongly convex cost functions, even with a class of biased but contractive compressors, and achieves the same accuracy as the corresponding algorithm with idealized (uncompressed) communication. Additionally, we rigorously prove that CPGT ensures differential privacy. Simulations are provided to validate the effectiveness of the proposed algorithm. Copyright (c) 2023 The Authors. This is an open access article under the CC BY-NC-ND license (https://creativecommons.org/licenses/by-nc-nd/4.0/)
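To illustrate the "biased but contractive" compressor class the abstract refers to, the sketch below shows a top-k sparsifier, a standard example of such a compressor. This is an assumption for illustration only; the paper may analyze a different compressor. A compressor C is contractive with parameter delta in (0, 1] if ||C(x) - x||^2 <= (1 - delta) ||x||^2; for top-k on d-dimensional vectors, delta = k/d.

```python
import numpy as np

def top_k(x: np.ndarray, k: int) -> np.ndarray:
    """Top-k compressor: keep the k largest-magnitude entries, zero the rest.

    Biased (E[C(x)] != x in general) but contractive:
        ||C(x) - x||^2 <= (1 - k/d) * ||x||^2,  d = x.size.
    """
    out = np.zeros_like(x)
    idx = np.argsort(np.abs(x))[-k:]  # indices of the k largest magnitudes
    out[idx] = x[idx]
    return out

# Check the contraction bound on a random vector.
rng = np.random.default_rng(0)
d, k = 10, 3
x = rng.standard_normal(d)
c = top_k(x, k)
err = np.linalg.norm(c - x) ** 2
bound = (1 - k / d) * np.linalg.norm(x) ** 2
```

Because the error bound holds deterministically for top-k, each agent in a scheme like CPGT can transmit only k entries per round while the analysis retains a uniform per-step contraction factor.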
Keywords
Compressed communication, distributed optimization, differential privacy, linear convergence