Pre-Training Transformers as Energy-Based Cloze Models

EMNLP 2020, pp. 285-294, 2020.


Abstract:

We introduce Electric, an energy-based cloze model for representation learning over text. Like BERT, it is a conditional generative model of tokens given their contexts. However, Electric does not use masking or output a full distribution over tokens that could occur in a context. Instead, it assigns a scalar energy score to each input token indicating how likely it is given its context.
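A rough sketch of the scoring interface the abstract describes: each input token gets a scalar energy from its contextualized representation, rather than a softmax distribution over the whole vocabulary. This is only an illustration under assumptions; the encoder, sizes, and names below are hypothetical stand-ins, not the paper's implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy sizes for illustration (not from the paper).
seq_len, hidden = 6, 8

# Stand-in for a transformer's contextualized representation h_t of each
# token given its context; a real model would produce these with a
# BERT-style encoder over the full, unmasked input.
h = rng.normal(size=(seq_len, hidden))

# Energy head: a single learned vector mapping each position to a scalar.
w = rng.normal(size=(hidden,))

# One scalar energy per input token: low energy suggests the token is
# likely in its context, high energy suggests it is unlikely. Note that
# no distribution over the vocabulary is computed, which is the contrast
# with BERT the abstract draws.
energies = h @ w  # shape: (seq_len,)

print(energies.shape)
```

Because the energies are unnormalized, such a model cannot be trained by maximum likelihood directly; the abstract's framing implies a training signal that avoids normalizing over the vocabulary.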
