FedClust: Optimizing Federated Learning on Non-IID Data through Weight-Driven Client Clustering
arXiv (2024)
Abstract
Federated learning (FL) is an emerging distributed machine learning paradigm
enabling collaborative model training on decentralized devices without exposing
their local data. A key challenge in FL is the uneven data distribution across
client devices, violating the well-known assumption of
independent-and-identically-distributed (IID) training samples in conventional
machine learning. Clustered federated learning (CFL) addresses this challenge
by grouping clients based on the similarity of their data distributions.
However, existing CFL approaches require a large number of communication rounds
for stable cluster formation and rely on a predefined number of clusters, thus
limiting their flexibility and adaptability. This paper proposes FedClust, a
novel CFL approach leveraging correlations between local model weights and
client data distributions. FedClust groups clients into clusters in a one-shot
manner using strategically selected partial model weights and dynamically
accommodates newcomers in real-time. Experimental results demonstrate FedClust
outperforms baseline approaches in terms of accuracy and communication costs.
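The abstract describes one-shot clustering of clients by the similarity of strategically selected partial model weights, with no predefined number of clusters. A minimal sketch of that idea is shown below, assuming each client contributes a flattened vector of selected weights (e.g. its final layer after brief local training) and using hierarchical clustering cut at a distance threshold so the cluster count emerges from the data; the paper's actual layer-selection strategy and distance measure are not reproduced here.

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.spatial.distance import pdist

def cluster_clients(partial_weights, distance_threshold=1.0):
    """One-shot grouping of clients from partial model weights.

    partial_weights: (n_clients, d) array, one flattened weight vector per
    client (an assumption for illustration; FedClust's exact weight-selection
    scheme is described in the paper itself).
    """
    pairwise = pdist(partial_weights, metric="euclidean")
    tree = linkage(pairwise, method="average")
    # Cut the dendrogram at a distance threshold rather than a fixed k,
    # so the number of clusters is not predefined.
    return fcluster(tree, t=distance_threshold, criterion="distance")

# Toy demo: two synthetic "data distributions" yield two weight groups.
rng = np.random.default_rng(0)
group_a = rng.normal(0.0, 0.05, size=(5, 16))   # clients with one distribution
group_b = rng.normal(3.0, 0.05, size=(5, 16))   # clients with another
labels = cluster_clients(np.vstack([group_a, group_b]), distance_threshold=1.0)
```

A newcomer could be accommodated in the same spirit by computing its weight vector's distance to each existing cluster centroid and assigning it to the nearest one, without re-clustering all clients.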