Simple and Effective Regularization Methods for Training on Noisily Labeled Data with Generalization Guarantee

ICLR, 2020.

Summary:

Towards understanding the generalization of deep neural networks in the presence of noisy labels, this paper presents two simple regularization methods and shows that they are both theoretically and empirically effective.

Abstract:

Over-parameterized deep neural networks trained by simple first-order methods are known to be able to fit any labeling of data. Such over-fitting ability hinders generalization when mislabeled training examples are present. On the other hand, simple regularization methods like early-stopping can often achieve highly nontrivial performance…
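As an illustration of the early-stopping idea mentioned in the abstract (a minimal sketch, not the paper's specific method; the function names and the patience-based stopping rule are assumptions for illustration), the following halts training once held-out loss stops improving, before the network has time to fit mislabeled examples:

```python
# Illustrative early-stopping loop (hypothetical helper names):
# step_fn() runs one training epoch; val_loss_fn() returns held-out loss.
def train_with_early_stopping(step_fn, val_loss_fn, max_epochs=100, patience=5):
    best_loss = float("inf")
    best_epoch = 0
    for epoch in range(max_epochs):
        step_fn()
        loss = val_loss_fn()
        if loss < best_loss:
            best_loss, best_epoch = loss, epoch
        elif epoch - best_epoch >= patience:
            # Validation loss has plateaued: stop before the model starts
            # memorizing noisy labels and the held-out loss climbs back up.
            break
    return best_epoch, best_loss
```

With noisy labels the validation-loss curve is typically U-shaped, so stopping at its minimum acts as an implicit regularizer.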