Revisiting Self-Training for Neural Sequence Generation

ICLR, 2020.

Experiments on machine translation and text summarization demonstrate the effectiveness of this approach in both low- and high-resource settings.

Abstract:

Self-training is one of the earliest and simplest semi-supervised methods. The key idea is to augment the original labeled dataset with unlabeled data paired with the model's predictions (i.e., the pseudo-parallel data). While self-training has been extensively studied on classification problems, in complex sequence generation tasks (e.g., m…
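The augmentation loop described in the abstract can be sketched in a few lines. The snippet below is a hypothetical toy illustration, not the paper's neural seq2seq setup: `train` and `predict` are stand-in functions (here a simple memorizing map), and the loop shows the generic self-training recipe of pseudo-labeling unlabeled inputs and retraining on the combined data.

```python
def train(pairs):
    # Toy "model": memorize the input -> output mapping.
    # (Stand-in for fitting a neural sequence-generation model.)
    return dict(pairs)

def predict(model, x):
    # Return the memorized output, falling back to the input itself.
    # (Stand-in for running model inference on x.)
    return model.get(x, x)

def self_train(labeled, unlabeled, rounds=1):
    # 1) Train a base model on the labeled data.
    model = train(labeled)
    for _ in range(rounds):
        # 2) Pair each unlabeled input with the model's prediction
        #    to form pseudo-parallel data.
        pseudo = [(x, predict(model, x)) for x in unlabeled]
        # 3) Retrain on the original labeled data plus the pseudo pairs.
        model = train(labeled + pseudo)
    return model
```

In practice the retraining step would use a fresh model fit on the augmented corpus, and the paper studies what makes this loop effective for sequence generation rather than classification.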
