Adaptive Gradient Quantization for Data-Parallel SGD

Advances in Neural Information Processing Systems (NeurIPS 2020)

Cited by 61 | Views 120
Abstract
Many communication-efficient variants of SGD use gradient quantization schemes. These schemes are often heuristic and fixed over the course of training. We empirically observe that the statistics of gradients of deep models change during training. Motivated by this observation, we introduce two adaptive quantization schemes, ALQ and AMQ. In both schemes, processors update their compression schemes in parallel by efficiently computing sufficient statistics of a parametric distribution. We improve validation accuracy by almost 2% on CIFAR-10 and 1% on ImageNet in challenging low-cost communication setups. Our adaptive methods are also significantly more robust to the choice of hyperparameters.
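To make the idea concrete, here is a minimal, hedged sketch of an adaptive stochastic quantization step: the quantization levels are placed using a simple statistic of the normalized gradient coordinates, standing in for the sufficient statistics of the parametric distribution mentioned in the abstract. This is an illustration only, not the paper's ALQ or AMQ procedures; the level-placement rule and all function names are assumptions.

```python
# Illustrative sketch only -- not the paper's ALQ/AMQ algorithms.
# Levels are spaced by a simple statistic of the normalized gradient
# (an assumption standing in for the paper's sufficient statistics).
import numpy as np

def update_levels(grad, num_levels=8, spread=3.0):
    """Place quantization levels adaptively from gradient statistics."""
    v = grad / (np.linalg.norm(grad) + 1e-12)       # normalized coordinates
    sigma = np.sqrt(np.mean(v ** 2)) + 1e-12        # cheap summary statistic
    return np.linspace(-spread * sigma, spread * sigma, num_levels)

def quantize(grad, levels):
    """Stochastically round each normalized coordinate to a neighboring level."""
    g_norm = np.linalg.norm(grad) + 1e-12
    v = np.clip(grad / g_norm, levels[0], levels[-1])
    idx = np.clip(np.searchsorted(levels, v, side="right") - 1, 0, len(levels) - 2)
    lo, hi = levels[idx], levels[idx + 1]
    p = (v - lo) / (hi - lo)                        # round up with prob. p, so E[q] = v in range
    q = np.where(np.random.rand(*v.shape) < p, hi, lo)
    return g_norm * q                               # rescale back to gradient magnitude

# Example: quantize a synthetic gradient with adaptively chosen levels.
g = np.random.randn(1_000) * 0.01
q = quantize(g, update_levels(g))
```

In a data-parallel setup, each worker would transmit only the level indices and the gradient norm, and the levels themselves would be refreshed periodically as the gradient statistics drift during training.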
Keywords
adaptive gradient quantization, data-parallel