ResFed: Communication Efficient Federated Learning With Deep Compressed Residuals

IEEE Internet of Things Journal (2023)

Abstract
Federated learning allows for cooperative training among distributed clients by sharing their locally learned model parameters, such as weights or gradients. However, as model size increases, the communication bandwidth required for deployment in wireless networks becomes a bottleneck. To address this, we propose a residual-based federated learning framework (ResFed) that transmits residuals instead of gradients or weights over the network. By predicting model updates at both the clients and the server, residuals are calculated as the difference between the updated and predicted models, and contain denser information than weights or gradients. We find that residuals are less sensitive to an increasing compression ratio than other parameters, and hence apply lossy compression techniques to residuals to improve communication efficiency for training in federated settings. At the same compression ratio, ResFed outperforms current methods (weight- or gradient-based federated learning) by over 1.4× in client-to-server communication on federated datasets including MNIST, FashionMNIST, SVHN, CIFAR-10, CIFAR-100, and FEMNIST, and can also be applied to reduce communication costs in server-to-client communication.
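To make the residual exchange concrete, below is a minimal NumPy sketch of the mechanism the abstract describes: both sides hold the same predicted model, the client sends only the (lossily compressed) difference, and the server adds it back onto its own prediction. The helper names (`residual_update`, `server_reconstruct`) and the top-k magnitude sparsification used as the lossy compressor are illustrative assumptions; the abstract does not specify the paper's prediction rule or deep compression pipeline.

```python
import numpy as np

def residual_update(local_weights, predicted_weights, k_ratio=0.01):
    """Client side (hypothetical sketch): compute the residual between
    the locally updated weights and the weights predicted by the shared
    prediction rule, then lossily compress it before transmission."""
    residual = local_weights - predicted_weights

    # Illustrative lossy compression: keep the top-k entries by magnitude
    # and zero out the rest (a stand-in for the paper's compression scheme).
    k = max(1, int(k_ratio * residual.size))
    flat = residual.ravel()
    idx = np.argpartition(np.abs(flat), -k)[-k:]
    compressed = np.zeros_like(flat)
    compressed[idx] = flat[idx]
    return compressed.reshape(residual.shape)

def server_reconstruct(predicted_weights, compressed_residual):
    """Server side: add the received residual back onto the same
    prediction to approximately recover the client's updated model."""
    return predicted_weights + compressed_residual

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    pred = rng.normal(size=(4, 4))          # shared predicted model
    local = pred + 0.1 * rng.normal(size=(4, 4))  # client's updated model
    r = residual_update(local, pred, k_ratio=0.25)
    restored = server_reconstruct(pred, r)
    print("max reconstruction error:", np.abs(restored - local).max())
```

Because both parties derive the same prediction, only the residual needs to travel; its error tolerance under aggressive sparsification is what the abstract reports as the source of the communication savings.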
Keywords
federated learning, communication efficiency, deep compression, protocol design