ElastiCL: Elastic Quantization for Communication Efficient Collaborative Learning in IoT

Embedded Networked Sensor Systems (2021)

Abstract
Transmitting updates of high-dimensional models between client IoT devices and the central aggregating server has always been a bottleneck in collaborative learning, especially in uncertain real-world IoT networks where congestion, latency, and bandwidth issues are common. In this setting, gradient quantization is an effective way to reduce the number of bits transmitted per model update, but at the cost of an elevated error floor caused by the higher variance of the stochastic gradients. In this paper, we propose ElastiCL, an elastic quantization strategy that achieves transmission efficiency together with a low error floor by dynamically adjusting the number of quantization levels during training on distributed IoT devices. Experiments on training ResNet-18 and a vanilla CNN show that ElastiCL converges with far fewer transmitted bits than fixed-level quantization, with little or no compromise on training and test accuracy.
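To make the trade-off concrete, the sketch below shows QSGD-style stochastic gradient quantization in which the number of levels is a tunable parameter, the kind of knob an elastic scheme could adjust across training rounds. This is a minimal illustration under assumptions, not the authors' implementation; the function names and the example level schedule are hypothetical.

```python
# Minimal sketch: stochastic (QSGD-style) gradient quantization with an
# adjustable number of levels. Fewer levels -> fewer bits but higher
# variance; more levels -> more bits but a lower error floor.
import numpy as np

def quantize(grad: np.ndarray, num_levels: int):
    """Stochastically quantize a gradient onto `num_levels` levels per sign,
    scaled by its L2 norm, so the estimator stays unbiased."""
    norm = np.linalg.norm(grad)
    if norm == 0.0:
        return np.zeros_like(grad, dtype=np.int8), 0.0
    scaled = np.abs(grad) / norm * num_levels   # magnitudes mapped to [0, num_levels]
    lower = np.floor(scaled)
    prob_up = scaled - lower                    # randomized rounding probability
    levels = lower + (np.random.rand(*grad.shape) < prob_up)
    return (np.sign(grad) * levels).astype(np.int8), float(norm)

def dequantize(levels: np.ndarray, norm: float, num_levels: int) -> np.ndarray:
    """Server-side reconstruction of the (approximate) gradient."""
    return norm * levels.astype(np.float32) / num_levels

# Hypothetical "elastic" schedule: coarse quantization early in training,
# finer quantization later as a lower error floor is needed.
grad = np.random.randn(1000).astype(np.float32)
for num_levels in (4, 16, 64):
    q, norm = quantize(grad, num_levels)
    err = np.linalg.norm(dequantize(q, norm, num_levels) - grad) / np.linalg.norm(grad)
    print(f"levels={num_levels:3d}  relative error={err:.3f}")
```

Running the loop shows the relative reconstruction error shrinking as the level count grows, which is the variance-versus-bits trade-off that motivates varying the level count instead of fixing it.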