FedSL: A Communication Efficient Federated Learning With Split Layer Aggregation

IEEE Internet of Things Journal (2024)

Abstract
Federated learning can train a model collaboratively across multiple remote clients without sharing raw data. A key challenge in federated learning is reducing network transmission. This paper aims to reduce network traffic by transmitting fewer neural network parameters. We first investigate the similarity of corresponding layers across convolutional neural network (CNN) models in federated learning and find substantial redundant information in the models' feature extractors. Based on this observation, we propose a communication-efficient federated aggregation algorithm named FedSL (Federated Split Layers) to reduce communication overhead. According to the number of global model layers, FedSL divides client models into groups along the depth dimension. A Max-Min client selection strategy is employed to select the participants for each layer. Each client transfers only the parameters of the layers for which it is selected, which reduces the number of transmitted parameters. FedSL aggregates the global model within each group and concatenates the parameters of all groups in layer order. The experimental results demonstrate that FedSL improves communication efficiency over baseline algorithms (e.g., FedAvg, FedProx, MOON), reducing communication cost by 42% with a VGG-style CNN and by 70% with ResNet-9, while maintaining accuracy comparable to the baselines.
Keywords
Federated learning, communication cost, client selection, split aggregation
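The following is a minimal Python sketch of the split-layer aggregation idea described in the abstract, not the paper's implementation: the function names are hypothetical, a random per-layer subset stands in for the paper's Max-Min client selection strategy, and model weights are plain NumPy arrays rather than a real CNN.

```python
import numpy as np

rng = np.random.default_rng(0)

def aggregate_fedsl(client_models, clients_per_layer):
    """Build a global model layer by layer from different client subsets.

    client_models: list of models, one per client; each model is a list of
                   np.ndarray layer weights (all clients share the same
                   architecture, so layer i has the same shape everywhere).
    clients_per_layer: number of clients contributing to each layer group.

    Illustrative sketch only; the grouping and selection rules here are
    assumptions, not the paper's exact algorithm.
    """
    num_clients = len(client_models)
    num_layers = len(client_models[0])
    global_model = []
    for layer_idx in range(num_layers):
        # Stand-in for the Max-Min client selection strategy (assumption):
        # pick a subset of clients to upload only this layer's parameters.
        chosen = rng.choice(num_clients, size=clients_per_layer, replace=False)
        # Average the selected clients' parameters for this layer group,
        # FedAvg-style but restricted to the chosen subset.
        layer_avg = np.mean([client_models[c][layer_idx] for c in chosen],
                            axis=0)
        global_model.append(layer_avg)
    # Concatenate the per-group aggregates in layer order.
    return global_model

# Toy usage: 10 clients, a 4-layer model, 3 clients uploading each layer.
clients = [[rng.standard_normal((5, 5)) for _ in range(4)] for _ in range(10)]
global_model = aggregate_fedsl(clients, clients_per_layer=3)
print([w.shape for w in global_model])
```

In this sketch, each client uploads only the layers for which it was selected, so the expected per-round upload shrinks roughly in proportion to clients_per_layer / num_clients rather than the full model size, which is where the communication savings reported in the abstract would come from.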