FedEED: Efficient Federated Distillation with Ensemble of Aggregated Models

ICLR 2023

Abstract
In this paper, we study the key components of knowledge distillation-based model aggregation in federated learning (FL). We first propose a generalized distillation framework in which the federated distillation process is divided into three key stages. By investigating the contribution of each stage, we introduce a new FL framework, named Federated Efficient Ensemble Distillation (FedEED), in which the ensemble teacher is built from aggregated models. Experimental results show that FedEED outperforms state-of-the-art methods, including FedAvg and FedDF, on benchmark datasets. Beyond accuracy, FedEED also demonstrates improved scalability and privacy compared with existing distillation-based aggregation algorithms. In particular, FedEED does not require direct access to users' models, which protects user privacy. Furthermore, because the ensemble is built from aggregated models, FedEED is highly scalable, and its asymmetric distillation scheme allows server-side distillation to run in parallel with client-side local training, which can speed up the training of large-scale learning systems.
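The abstract describes two mechanisms: grouping client updates into averaged "aggregated models", and distilling their ensemble into a server-side student. A minimal sketch of that pipeline is given below, assuming group-wise averaging of client state dicts and soft-label distillation on unlabeled proxy data; all names here (group_average, distill_to_student, num_groups, the proxy loader) are illustrative assumptions, not the authors' implementation.

```python
# Sketch of a FedEED-style round: group-average client models, then distill
# the ensemble of aggregated models into a student. Illustrative only.
import copy
import torch
import torch.nn.functional as F

def group_average(client_states, num_groups):
    """Average client state_dicts within each group (e.g. via secure
    aggregation), so the server never sees an individual client model."""
    groups = [client_states[i::num_groups] for i in range(num_groups)]
    averaged = []
    for group in groups:
        avg = copy.deepcopy(group[0])
        for key in avg:
            stacked = torch.stack([s[key].float() for s in group])
            avg[key] = stacked.mean(0).to(avg[key].dtype)
        averaged.append(avg)
    return averaged

def distill_to_student(student, teacher_states, make_model, proxy_loader,
                       temperature=3.0, lr=1e-3, steps=100):
    """Server-side distillation: the student matches the averaged soft
    predictions of the ensemble of aggregated models on proxy data."""
    teachers = []
    for state in teacher_states:
        t = make_model()
        t.load_state_dict(state)
        t.eval()
        teachers.append(t)
    opt = torch.optim.Adam(student.parameters(), lr=lr)
    batches = iter(proxy_loader)
    for _ in range(steps):
        try:
            x, _ = next(batches)
        except StopIteration:
            batches = iter(proxy_loader)
            x, _ = next(batches)
        with torch.no_grad():
            # Ensemble teacher: mean of the aggregated models' soft labels.
            target = torch.stack(
                [F.softmax(t(x) / temperature, dim=1) for t in teachers]
            ).mean(0)
        loss = F.kl_div(F.log_softmax(student(x) / temperature, dim=1),
                        target, reduction="batchmean") * temperature ** 2
        opt.zero_grad()
        loss.backward()
        opt.step()
    return student
```

Because the teacher ensemble is built only from aggregated snapshots, a step like distill_to_student can run on the server while clients continue local training, which is consistent with the asymmetric, parallel scheme the abstract describes.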
Keywords
Federated Learning, Knowledge Distillation