Meta-Probability Weighting for Improving Reliability of DNNs to Label Noise

IEEE Journal of Biomedical and Health Informatics (2023)

Abstract
Training noise-robust deep neural networks (DNNs) under label noise is a crucial task. In this paper, we first demonstrate that DNNs trained with label noise over-fit the noisy labels because they are overconfident in their own learning capacity. More significantly, they also potentially under-learn the samples with clean labels. DNNs should essentially pay more attention to the clean samples than to the noisy ones. Inspired by the sample-weighting strategy, we propose a meta-probability weighting (MPW) algorithm that re-weights the output probabilities of DNNs to prevent over-fitting to label noise and to alleviate under-learning on clean samples. MPW performs an approximate optimization to adaptively learn the probability weights from data under the supervision of a small clean dataset, and it alternates between optimizing the probability weights and the network parameters via a meta-learning paradigm. Ablation studies substantiate the effectiveness of MPW in preventing DNNs from over-fitting to label noise and in improving learning on clean samples. Furthermore, MPW achieves performance competitive with state-of-the-art methods on both synthetic and real-world noise.
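The alternating meta-learning scheme described above can be sketched in a toy setting. The snippet below is a minimal illustration, not the paper's method: it uses per-sample loss weights on a 1-D logistic-regression model (the actual MPW algorithm re-weights DNN output probabilities), and the function name `meta_weight_step` and all hyperparameters are assumptions. The inner step virtually updates the model on the weighted noisy-set loss; the meta step then backpropagates the clean-set loss through that update to adjust the weights.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def meta_weight_step(theta, x, y, x_clean, y_clean, eps, alpha=0.1, beta=0.5):
    """One meta-weighting step (toy sketch, 1-D logistic regression).

    Inner step: weighted gradient update of theta on the noisy set.
    Meta step: adjust per-sample weights eps via the gradient of the
    clean-set loss through the one-step updated parameters.
    """
    # Per-sample gradients of the logistic loss on the noisy set.
    g = (sigmoid(theta * x) - y) * x
    # Inner (virtual) parameter update with the current weights.
    theta_new = theta - alpha * np.sum(eps * g)
    # Gradient of the clean-set loss at the updated parameters.
    g_clean = np.mean((sigmoid(theta_new * x_clean) - y_clean) * x_clean)
    # Chain rule: dL_clean/d eps_i = (dtheta_new/d eps_i) * g_clean = -alpha * g_i * g_clean.
    eps_grad = -alpha * g * g_clean
    # Meta update; clip to keep the weights nonnegative.
    eps = np.clip(eps - beta * eps_grad, 0.0, None)
    return theta_new, eps

# Two correctly labeled samples plus one with a flipped (noisy) label.
x = np.array([1.0, -1.0, 1.0])
y = np.array([1.0, 0.0, 0.0])        # last label is flipped
x_clean = np.array([1.0, -1.0])      # small clean supervision set
y_clean = np.array([1.0, 0.0])
eps = np.ones(3)
theta, eps = meta_weight_step(0.0, x, y, x_clean, y_clean, eps)
```

In this toy run the meta step raises the weights of the two clean samples and lowers the weight of the mislabeled one, which is the qualitative behavior the supervision by a small clean set is meant to produce.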
Keywords
Label noise, over-fitting, robust learning, meta-learning