Heterogeneous Federated Learning via Personalized Generative Networks
CoRR (2023)
Abstract
Federated Learning (FL) allows several clients to construct a common global
machine-learning model without having to share their data. FL, however, faces
the challenge of statistical heterogeneity among clients' data, which degrades
performance and slows convergence toward the global model. In this paper, we
provide a theoretical proof that minimizing heterogeneity between clients
facilitates the convergence of a global model for every single client. This
becomes particularly important under empirical concept shift among clients,
rather than merely the imbalanced classes that prior work has focused on.
Therefore, we propose a method for knowledge transfer between clients in which
the server trains client-specific generators. Each generator generates samples
for the corresponding client in order to remove the conflict with other
clients' models. Experiments conducted on synthetic and real data, along with
a theoretical study, support the effectiveness of our method in constructing a
well-generalizable global model by reducing the conflict between local models.
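
The abstract outlines the training scheme but gives no implementation details,
so the following is only a minimal, hypothetical PyTorch sketch of how such a
scheme could look: the server keeps one generator per client, trains it against
the other clients' models, and each client mixes the generated samples into its
local update before a standard FedAvg aggregation. All class and function names
(Generator, server_update_generator, client_update, federated_round) and all
hyperparameters are assumptions, not the authors' code.

```python
# Hypothetical sketch of FL with client-specific generators; not the paper's code.
import copy
import torch
import torch.nn as nn
import torch.nn.functional as F


class Generator(nn.Module):
    """Client-specific generator: maps noise + label to a synthetic feature vector."""

    def __init__(self, noise_dim=32, num_classes=10, feat_dim=64):
        super().__init__()
        self.noise_dim = noise_dim
        self.num_classes = num_classes
        self.net = nn.Sequential(
            nn.Linear(noise_dim + num_classes, 128), nn.ReLU(),
            nn.Linear(128, feat_dim),
        )

    def forward(self, labels):
        z = torch.randn(labels.size(0), self.noise_dim)
        y = F.one_hot(labels, self.num_classes).float()
        return self.net(torch.cat([z, y], dim=1))


def server_update_generator(gen, other_client_models, num_classes=10, steps=100):
    """Server step (assumed): train client i's generator so its samples are
    classified consistently by the *other* clients' models."""
    opt = torch.optim.Adam(gen.parameters(), lr=1e-3)
    for _ in range(steps):
        labels = torch.randint(0, num_classes, (64,))
        feats = gen(labels)
        # Encourage the other clients' classifiers to agree on the generated samples.
        loss = sum(F.cross_entropy(m(feats), labels) for m in other_client_models)
        opt.zero_grad()
        loss.backward()
        opt.step()
    return gen


def client_update(model, loader, gen, epochs=1, lam=0.5):
    """Client step (assumed): train on local data plus generator samples, which
    act as a regularizer against conflicting with other clients' models."""
    opt = torch.optim.SGD(model.parameters(), lr=0.01)
    for _ in range(epochs):
        for x_feats, y in loader:  # assumes pre-extracted feature vectors
            syn_y = torch.randint(0, gen.num_classes, (y.size(0),))
            syn_feats = gen(syn_y).detach()
            loss = (F.cross_entropy(model(x_feats), y)
                    + lam * F.cross_entropy(model(syn_feats), syn_y))
            opt.zero_grad()
            loss.backward()
            opt.step()
    return model


def federated_round(global_model, client_loaders, generators):
    """One communication round: local updates with generator samples, then FedAvg."""
    client_models = []
    for i, loader in enumerate(client_loaders):
        local = copy.deepcopy(global_model)
        client_models.append(client_update(local, loader, generators[i]))
    # Server: refresh each client's generator against the other clients' models.
    for i, gen in enumerate(generators):
        others = [m for j, m in enumerate(client_models) if j != i]
        server_update_generator(gen, others)
    # Average the local models into the new global model (FedAvg).
    with torch.no_grad():
        for p_glob, *p_locals in zip(global_model.parameters(),
                                     *[m.parameters() for m in client_models]):
            p_glob.copy_(torch.stack(p_locals).mean(dim=0))
    return global_model
```

In this reading, the generated samples pull each local model toward regions
where the other clients' models agree, which is one plausible way to realize
the abstract's goal of removing conflict between local models.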
Keywords
heterogeneous federated learning, networks