Federated Skewed Label Learning with Logits Fusion.
CoRR (2023)
Abstract
Federated learning (FL) aims to collaboratively train a shared model across
multiple clients without transmitting their local data. Data heterogeneity is a
critical challenge in realistic FL settings, as it causes significant
performance deterioration due to discrepancies in optimization among local
models. In this work, we focus on label distribution skew, a common scenario in
data heterogeneity, where the data label categories are imbalanced on each
client. To address this issue, we propose FedBalance, which corrects the
optimization bias among local models by calibrating their logits. Specifically,
we introduce an extra private weak learner on the client side, which forms an
ensemble model with the local model. By fusing the logits of the two models,
the private weak learner can capture the variance of different data regardless
of category. The optimization direction of local models is thereby improved by
increasing the penalty for misclassifying minority classes and reducing the
attention paid to majority classes, resulting in a better global model.
Extensive experiments show that our method achieves 13% higher average
accuracy than state-of-the-art methods.
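
To make the logits-fusion idea concrete, below is a minimal PyTorch sketch of a client-side update in which a private weak learner's logits are added to the local model's logits before the loss is computed. The additive fusion rule, the `local_update` signature, and all hyperparameters are illustrative assumptions, not the paper's exact FedBalance procedure.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F
from torch.utils.data import DataLoader


def local_update(local_model: nn.Module,
                 weak_learner: nn.Module,
                 loader: DataLoader,
                 lr: float = 0.01,
                 epochs: int = 1) -> nn.Module:
    """One client's local training round with fused logits.

    Only `local_model` is uploaded for server-side aggregation;
    `weak_learner` stays private on the client.
    """
    params = list(local_model.parameters()) + list(weak_learner.parameters())
    optimizer = torch.optim.SGD(params, lr=lr)

    for _ in range(epochs):
        for x, y in loader:
            optimizer.zero_grad()
            # Fuse the two models' logits before the cross-entropy loss.
            # The weak learner absorbs part of the confidence on majority
            # classes, so the loss penalizes minority-class errors more
            # heavily and pays less attention to already-easy majority
            # classes.
            fused_logits = local_model(x) + weak_learner(x)
            loss = F.cross_entropy(fused_logits, y)
            loss.backward()
            optimizer.step()

    return local_model
```

Because the weak learner is never communicated, the calibration it provides is free of any extra communication cost; only the (now less biased) local model participates in the usual federated averaging step.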