A Quantitative Survey of Communication Optimizations in Distributed Deep Learning

IEEE Network (2021)

Cited by 39 | Views 15
Abstract
Nowadays, large and complex deep learning (DL) models are increasingly trained in a distributed manner across multiple worker machines, in which extensive communications between workers pose serious scaling problems. In this article, we present a quantitative survey of communication optimization techniques for data parallel distributed DL. We first identify the major communication challenges and c...
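To make the communication bottleneck concrete, below is a minimal sketch (not from the paper) of the gradient all-reduce step in data-parallel synchronous training, written with PyTorch's torch.distributed. The helper name average_gradients is illustrative; only the dist.all_reduce and dist.get_world_size calls are real library APIs.

```python
import torch
import torch.distributed as dist

def average_gradients(model: torch.nn.Module) -> None:
    """All-reduce every gradient tensor, then divide by the worker count.

    This per-tensor exchange is the communication step that the surveyed
    optimizations (e.g., tensor fusion, gradient compression, scheduling)
    aim to shrink or overlap with computation.
    """
    world_size = dist.get_world_size()
    for param in model.parameters():
        if param.grad is not None:
            dist.all_reduce(param.grad, op=dist.ReduceOp.SUM)
            param.grad /= world_size
```

In this sketch, each worker would call average_gradients(model) after loss.backward() and before optimizer.step(), assuming a process group has already been initialized with dist.init_process_group.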
Keywords
Training, Computational modeling, Distributed databases, Parallel processing, Data models, Tensors, Task analysis