Federated Data Quality Assessment Approach: Robust Learning With Mixed Label Noise

IEEE Transactions on Neural Networks and Learning Systems (2023)

Abstract
Federated learning (FL) has been an effective way to train a machine learning model in a distributed fashion, keeping data local without exchanging it. However, because local data are inaccessible, FL with label noise is even more challenging. Most existing methods assume only open-set or closed-set noise and correspondingly propose filtering or correction solutions, ignoring that label noise can be mixed in real-world scenarios. In this article, we propose a novel FL method, named FedMIN, that discriminates the type of noise and makes FL robust to mixed noise. FedMIN employs a composite framework that captures local-global differences in multiparticipant distributions to model generalized noise patterns. By determining adaptive thresholds for identifying mixed label noise in each client and assigning appropriate weights during model aggregation, FedMIN enhances the performance of the global model. Furthermore, FedMIN incorporates a loss alignment mechanism using local and global Gaussian mixture models (GMMs) to mitigate the risk of revealing samplewise loss. Extensive experiments are conducted on several public datasets, including simulated FL testbeds, i.e., CIFAR-10, CIFAR-100, and SVHN, and real-world ones, i.e., Camelyon17 and the multiorgan nuclei challenge (MoNuSAC). Compared with FL benchmarks, FedMIN improves model accuracy by up to 9.9% owing to its superior noise estimation capabilities.
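To make the abstract's high-level description more concrete, the following is a minimal sketch of the general idea of fitting a GMM to per-sample losses and down-weighting clients that appear noisier during aggregation. It is an illustrative assumption based only on the abstract, not FedMIN's actual algorithm; the function names, the two-component GMM choice, and the simple weighting rule are all hypothetical.

```python
# Hypothetical sketch: GMM-based loss modeling and noise-aware aggregation.
# Not the FedMIN implementation; names and the weighting rule are assumptions.
import numpy as np
from sklearn.mixture import GaussianMixture


def estimate_clean_fraction(sample_losses: np.ndarray) -> float:
    """Fit a two-component GMM to per-sample losses and return the estimated
    fraction of clean samples (those assigned to the lower-mean component)."""
    losses = sample_losses.reshape(-1, 1)
    gmm = GaussianMixture(n_components=2, random_state=0).fit(losses)
    clean_component = int(np.argmin(gmm.means_.ravel()))
    assignments = gmm.predict(losses)
    return float(np.mean(assignments == clean_component))


def aggregate(client_models: list[dict], client_losses: list[np.ndarray]) -> dict:
    """Weight each client's parameters by its estimated clean fraction
    (a simple stand-in for adaptive weighting) and average them."""
    weights = np.array([estimate_clean_fraction(l) for l in client_losses])
    weights = weights / weights.sum()
    aggregated = {}
    for key in client_models[0]:
        stacked = np.stack([m[key] for m in client_models])
        aggregated[key] = np.tensordot(weights, stacked, axes=1)
    return aggregated


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Two toy clients: one mostly clean (low losses), one with many noisy labels.
    losses_a = np.concatenate([rng.normal(0.2, 0.05, 900), rng.normal(2.0, 0.3, 100)])
    losses_b = np.concatenate([rng.normal(0.2, 0.05, 500), rng.normal(2.0, 0.3, 500)])
    models = [{"w": np.ones(4)}, {"w": np.zeros(4)}]
    print(aggregate(models, [losses_a, losses_b]))
```

In this toy setup the noisier client receives a smaller aggregation weight; the paper's method additionally distinguishes open-set from closed-set noise and uses both local and global GMMs, which this sketch does not attempt to reproduce.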
Keywords
Noise measurement, Servers, Task analysis, Adaptation models, Data models, Data integrity, Computers, Data quality assessment, federated learning (FL), noise-robust algorithm