Communication Efficient Decentralized Training with Multiple Local Updates

arXiv (2019)

Abstract
Communication efficiency plays a significant role in decentralized optimization, especially when the data is highly non-identically distributed. In this paper, we propose a novel algorithm, Periodic Decentralized SGD (PD-SGD), to reduce the communication cost in a decentralized heterogeneous network. PD-SGD alternates between multiple local updates and multiple decentralized communications, making communication more flexible and controllable. We theoretically prove that PD-SGD converges at a rate of $O(\frac{1}{\sqrt{nT}})$ in the setting of stochastic non-convex optimization with non-i.i.d. data, where $n$ is the number of worker nodes and $T$ is the number of iterations. We also propose a novel decay strategy which periodically shrinks the length of local updates. Equipped with this strategy, PD-SGD can better balance the communication-convergence trade-off both theoretically and empirically.
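The abstract describes PD-SGD as alternating between a phase of local SGD steps and a phase of decentralized (gossip) averaging, plus a decay strategy that periodically shrinks the local-update length. Below is a minimal illustrative sketch of that alternation on synthetic heterogeneous quadratic objectives; the ring mixing matrix, step counts, learning rate, and halving schedule are hypothetical choices for demonstration, not the paper's actual algorithm details or experimental setup.

```python
# Illustrative sketch of periodic local updates + decentralized communication.
# All hyperparameters and the toy objectives here are hypothetical, chosen only
# to demonstrate the alternation pattern described in the abstract.
import numpy as np

n, d = 8, 10                  # workers, model dimension
rng = np.random.default_rng(0)

# Non-i.i.d. synthetic objectives: worker i minimizes ||x - b_i||^2 / 2,
# with heterogeneous targets b_i across workers.
b = rng.normal(size=(n, d)) * np.arange(1, n + 1)[:, None]

# Doubly stochastic mixing matrix for a ring topology: each node averages
# with its two neighbours during the decentralized communication phase.
W = np.zeros((n, n))
for i in range(n):
    W[i, i] = 0.5
    W[i, (i - 1) % n] = 0.25
    W[i, (i + 1) % n] = 0.25

x = np.zeros((n, d))          # one parameter copy per worker
lr = 0.05

def local_sgd_step(x_i, b_i):
    """One noisy gradient step on a worker's local objective."""
    grad = (x_i - b_i) + 0.1 * rng.normal(size=d)
    return x_i - lr * grad

H = 16                        # initial number of local updates per period
for period in range(6):
    # Phase 1: H local SGD updates on each worker, no communication.
    for _ in range(H):
        for i in range(n):
            x[i] = local_sgd_step(x[i], b[i])
    # Phase 2: a few gossip-averaging rounds over the ring topology.
    for _ in range(2):
        x = W @ x
    # Decay strategy: periodically shrink the local-update length.
    H = max(1, H // 2)

consensus_gap = np.linalg.norm(x - x.mean(axis=0))
print(f"consensus gap after training: {consensus_gap:.4f}")
print(f"distance to global optimum:   {np.linalg.norm(x.mean(axis=0) - b.mean(axis=0)):.4f}")
```

In this sketch, longer local phases (large H) save communication but let workers drift apart on non-i.i.d. data; shrinking H over time tightens consensus as training progresses, which is the trade-off the decay strategy is meant to balance.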