Multilingual Denoising Pre-training for Neural Machine Translation.

Transactions of the Association for Computational Linguistics (2020)

Abstract
This paper demonstrates that multilingual denoising pre-training produces significant performance gains across a wide variety of machine translation (MT) tasks. We present mBART—a sequence-to-sequence…
Key words
neural machine translation, pre-training