DFML: Decentralized Federated Mutual Learning
CoRR(2024)
Abstract
In the realm of real-world devices, centralized servers in Federated Learning
(FL) present challenges including communication bottlenecks and susceptibility
to a single point of failure. Additionally, contemporary devices inherently
exhibit model and data heterogeneity. Existing work lacks a Decentralized FL
(DFL) framework capable of accommodating such heterogeneity without imposing
architectural restrictions or assuming the availability of public data. To
address these issues, we propose a Decentralized Federated Mutual Learning
(DFML) framework that is serverless, supports nonrestrictive heterogeneous
models, and avoids reliance on public data. DFML effectively handles model and
data heterogeneity through mutual learning, which distills knowledge between
clients, and cyclically varying the amount of supervision and distillation
signals. Extensive experimental results demonstrate consistent effectiveness of
DFML in both convergence speed and global accuracy, outperforming prevalent
baselines under various conditions. For example, with the CIFAR-100 dataset and
50 clients, DFML achieves gains of up to +17.20% in global accuracy under both
Independent and Identically Distributed (IID) and non-IID data shifts.
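The abstract describes combining a supervision signal with a mutual-distillation signal whose relative weights vary cyclically across communication rounds. The sketch below illustrates one plausible realization of such a schedule; the cosine form, the `cycle_len` parameter, and all function names are assumptions for illustration, not the paper's exact formulation.

```python
import math

def cyclic_alpha(round_idx, cycle_len=10):
    # Hypothetical cosine schedule: alpha sweeps from 1.0 (pure
    # supervision) down to 0.0 (pure distillation) and back over
    # each cycle of communication rounds.
    phase = (round_idx % cycle_len) / cycle_len
    return 0.5 * (1.0 + math.cos(2 * math.pi * phase))

def mutual_learning_loss(ce_loss, kl_loss, round_idx, cycle_len=10):
    # Weighted objective: alpha * supervision (cross-entropy on local
    # labels) + (1 - alpha) * distillation (KL to peer predictions).
    a = cyclic_alpha(round_idx, cycle_len)
    return a * ce_loss + (1.0 - a) * kl_loss
```

Cycling the weight lets each client alternate between fitting its own labeled data and absorbing knowledge distilled from heterogeneous peer models, without requiring a shared architecture or public data.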