Overcoming Noisy Labels in Federated Learning Through Local Self-Guiding

2023 IEEE/ACM 23rd International Symposium on Cluster, Cloud and Internet Computing (CCGrid), 2023

Abstract
Federated Learning (FL) is a privacy-preserving machine learning paradigm that enables clients, such as Internet of Things (IoT) devices and smartphones, to jointly train a high-performance global model. However, in real-world FL deployments, carefully human-annotated labels are expensive and time-consuming to obtain. The presence of incorrect labels (noisy labels) in the clients' local training data is therefore inevitable and degrades the performance of the global model. To tackle this problem, we propose a simple but effective method, Local Self-Guiding (LSG), which lets clients guide themselves during training in the presence of noisy labels. Specifically, LSG keeps the model from memorizing noisy labels by enhancing the confidence of model predictions. Meanwhile, it utilizes the knowledge of local historical models that have not yet fit the noisy patterns to extract potential ground-truth labels of samples. To retain this knowledge without storing models, LSG records the exponential moving average (EMA) of model output logits across local training epochs as self-ensemble logits on the clients' devices, incurring negligible computation and storage overhead. Logit-based knowledge distillation is then conducted to guide the local training. Experiments on MNIST, Fashion-MNIST, CIFAR-10, and ImageNet-100 with multiple noise levels, as well as on an unbalanced noisy dataset, Clothing1M, demonstrate the resistance of LSG to noisy labels. The code of LSG is available at https://github.com/DaokuanBai/LSG-Main
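The core mechanism described above can be sketched as follows. This is a minimal NumPy illustration, not the authors' implementation: the function names, the EMA momentum of 0.9, and the distillation temperature are all assumptions made for illustration.

```python
import numpy as np

def softmax(logits, temperature=1.0):
    """Numerically stable softmax at a given temperature."""
    z = np.asarray(logits, dtype=float) / temperature
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def update_ema_logits(ema_logits, new_logits, momentum=0.9):
    """Maintain an EMA of a sample's output logits across local epochs.

    Only the running logit vector is stored per sample, so no historical
    model checkpoints are kept (the 'self-ensemble' described in the paper).
    """
    if ema_logits is None:  # first local epoch: initialize with current logits
        return np.array(new_logits, dtype=float)
    return momentum * np.asarray(ema_logits, dtype=float) \
        + (1.0 - momentum) * np.asarray(new_logits, dtype=float)

def distillation_loss(student_logits, ema_logits, temperature=2.0):
    """KL divergence from the EMA self-ensemble (teacher) to the current
    model (student); added to the usual loss on the possibly-noisy label."""
    p = softmax(ema_logits, temperature)      # teacher distribution
    q = softmax(student_logits, temperature)  # student distribution
    return float(np.sum(p * (np.log(p + 1e-12) - np.log(q + 1e-12))))
```

During local training, each client would update `ema_logits` for every sample once per epoch and add `distillation_loss` (suitably weighted) to its classification loss, so that earlier, less-overfit predictions guide later epochs.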
Keywords
Federated learning, data with noisy labels, robustness