Dynamic Sampling and Selective Masking for Communication-Efficient Federated Learning

IEEE Intelligent Systems (2022)

Abstract
Federated learning (FL) is a novel machine learning setting that enables on-device intelligence via decentralized training and federated optimization. The rapid development of deep neural networks has advanced techniques for modeling complex problems and, under the federated setting, has given rise to federated deep learning. However, the enormous number of model parameters places a heavy transmission load on the communication network. This article introduces two approaches for improving communication efficiency: dynamic sampling and top-$k$ selective masking. The former dynamically controls the fraction of selected client models, while the latter selects the parameters with the top-$k$ largest difference values for federated updating. Experiments on convolutional image classification and recurrent language modeling, conducted on three public datasets, demonstrate the effectiveness of the proposed methods.
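The two ideas can be illustrated with a short sketch. The Python code below is a minimal illustration under stated assumptions, not the authors' implementation: the function names (sample_clients, top_k_mask) are hypothetical, and the linear decay schedule for the client fraction is an assumed example, since the abstract only states that the fraction is controlled dynamically.

```python
import random
import numpy as np

def sample_clients(clients, round_t, total_rounds, frac_max=0.5, frac_min=0.1):
    """Dynamic sampling: select a round-dependent fraction of clients.

    The linear decay from frac_max to frac_min is a hypothetical schedule;
    the paper only states that the selected fraction changes dynamically.
    """
    frac = frac_max - (frac_max - frac_min) * round_t / total_rounds
    n_selected = max(1, int(frac * len(clients)))
    return random.sample(clients, n_selected)

def top_k_mask(global_params, local_params, k):
    """Top-k selective masking: keep only the k parameter differences
    with the largest magnitude; zero out the rest before uploading."""
    diff = local_params - global_params          # parameter-wise difference
    top_idx = np.argpartition(np.abs(diff).ravel(), -k)[-k:]
    mask = np.zeros(diff.size, dtype=bool)
    mask[top_idx] = True
    return np.where(mask.reshape(diff.shape), diff, 0.0)

# Toy usage: pick clients at round 10 of 50, then mask a 1,000-parameter update.
selected = sample_clients(list(range(100)), round_t=10, total_rounds=50)
masked_update = top_k_mask(np.zeros(1000), np.random.randn(1000), k=50)
```

In this sketch, only the masked (mostly zero) update would be transmitted, so the upload cost scales with k rather than with the full parameter count.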
Keywords
dynamic sampling,selective masking,communication-efficient federated learning,on-device intelligence,decentralized training,deep neural networks,complex problems,federated deep learning,federated setting,communication network,communication efficiency,client models,federated updating,recurrent language modeling