Explicit Alignment Objectives for Multilingual Bidirectional Encoders


Abstract:

Pre-trained cross-lingual encoders such as mBERT (Devlin et al., 2019) and XLM-R (Conneau et al., 2020) have proven to be impressively effective at enabling transfer-learning of NLP systems from high-resource languages to low-resource languages. This success comes despite the fact that there is no explicit objective to align the contextual …
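For context, the sketch below shows one common form that an explicit sentence-level alignment objective can take: an InfoNCE-style contrastive loss that pulls the pooled embeddings of parallel sentences together while pushing apart non-parallel pairs in the batch. The model name (`xlm-roberta-base`), mean pooling, temperature, and the toy English-German data are illustrative assumptions, not necessarily the objectives proposed in this paper.

```python
# Minimal sketch of an explicit sentence-level alignment objective
# (contrastive / InfoNCE-style), assuming a HuggingFace multilingual encoder.
import torch
import torch.nn.functional as F
from transformers import AutoModel, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("xlm-roberta-base")
model = AutoModel.from_pretrained("xlm-roberta-base")

def mean_pool(batch):
    """Mean-pool token embeddings, ignoring padding positions."""
    hidden = model(**batch).last_hidden_state                # (B, T, H)
    mask = batch["attention_mask"].unsqueeze(-1).float()     # (B, T, 1)
    return (hidden * mask).sum(dim=1) / mask.sum(dim=1).clamp(min=1e-9)

def sentence_alignment_loss(src_sents, tgt_sents, temperature=0.05):
    """Each source sentence should be closer to its own translation than to
    any other target sentence in the batch (and vice versa)."""
    src = tokenizer(src_sents, padding=True, truncation=True, return_tensors="pt")
    tgt = tokenizer(tgt_sents, padding=True, truncation=True, return_tensors="pt")
    h_src = F.normalize(mean_pool(src), dim=-1)
    h_tgt = F.normalize(mean_pool(tgt), dim=-1)
    sims = h_src @ h_tgt.T / temperature                     # (B, B) similarities
    labels = torch.arange(sims.size(0))                      # positives on diagonal
    return (F.cross_entropy(sims, labels) + F.cross_entropy(sims.T, labels)) / 2

# Toy usage on a tiny parallel batch (illustrative data only).
loss = sentence_alignment_loss(
    ["The cat sleeps.", "I like tea."],
    ["Die Katze schläft.", "Ich mag Tee."],
)
loss.backward()  # gradients flow back into the encoder during fine-tuning
```

In practice such an alignment loss would be combined with the encoder's masked language modeling objective and trained on parallel corpora; the exact combination used here is an assumption, not the paper's reported recipe.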
