Communication-Efficient Distributed Learning with Local Immediate Error Compensation
CoRR (2024)
Abstract
Gradient compression with error compensation has attracted significant attention as a way to reduce the heavy communication overhead in distributed learning. However, existing compression methods either perform only unidirectional compression in each iteration, incurring a higher communication cost, or perform bidirectional compression at the price of a slower convergence rate. In this work, we propose the Local Immediate Error Compensated SGD (LIEC-SGD) optimization algorithm to break the above bottleneck, based on bidirectional compression and carefully designed compensation approaches. Specifically, the bidirectional compression technique reduces the communication cost, while the compensation technique immediately folds the local compression error into the model update and maintains only the global error variable on the server throughout the iterations, boosting its efficacy. Theoretically, we prove that LIEC-SGD is superior to previous works in either the convergence rate or the communication cost, which indicates that LIEC-SGD inherits the dual advantages of unidirectional and bidirectional compression. Finally, experiments on training deep neural networks validate the effectiveness of the proposed LIEC-SGD algorithm.
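
To make the mechanism concrete, below is a minimal NumPy sketch of one training step in the spirit of the abstract: gradients are compressed on the uplink, each worker immediately folds its own compression residual into its model update, and the server keeps the only error variable that persists across iterations. Top-k sparsification as the compressor, the function names, and the toy usage at the bottom are illustrative assumptions, not the paper's reference implementation.

import numpy as np

def topk(v, k):
    """Top-k sparsifier: keep the k largest-magnitude entries, zero the rest."""
    out = np.zeros_like(v)
    idx = np.argpartition(np.abs(v), -k)[-k:]
    out[idx] = v[idx]
    return out

def liec_sgd_step(models, grad_fn, server_err, lr, k):
    """One bidirectionally compressed step in the spirit of LIEC-SGD (sketch).

    models     -- list of per-worker parameter vectors
    grad_fn    -- grad_fn(i, x) returns worker i's stochastic gradient at x
    server_err -- the single error variable kept on the server across steps
    """
    # Uplink: each worker compresses its gradient and keeps the residual.
    compressed, residuals = [], []
    for i, x in enumerate(models):
        g = grad_fn(i, x)
        c = topk(g, k)
        compressed.append(c)
        residuals.append(g - c)

    # Server: average, fold in the persistent global error, compress downlink.
    agg = sum(compressed) / len(models) + server_err
    update = topk(agg, k)
    server_err = agg - update  # only this error carries over to the next step

    # Workers: apply the broadcast update AND their own residual immediately,
    # so no per-worker error buffer is kept between iterations.
    new_models = [x - lr * (update + r) for x, r in zip(models, residuals)]
    return new_models, server_err

# Toy usage (hypothetical): two workers on a noisy quadratic objective.
rng = np.random.default_rng(0)
models = [np.ones(10) for _ in range(2)]
grad_fn = lambda i, x: x + 0.01 * rng.standard_normal(10)  # grad of ||x||^2 / 2
err = np.zeros(10)
for _ in range(100):
    models, err = liec_sgd_step(models, grad_fn, err, lr=0.1, k=3)

The design point this sketch tries to capture is that immediate local compensation removes the per-worker error buffers of classical error-feedback schemes: only the server-side variable server_err accumulates state across rounds, while both communication directions stay compressed.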