Multilingual Denoising Pre-training for Neural Machine Translation.
Transactions of the Association for Computational Linguistics (2020)
Abstract
This paper demonstrates that multilingual denoising pre-training produces significant performance gains across a wide variety of machine translation (MT) tasks. We present mBART—a sequence-to-sequence […]
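The abstract describes pre-training a sequence-to-sequence model as a denoising auto-encoder: spans of the input text are corrupted, and the model is trained to reconstruct the original. The sketch below illustrates one such noising scheme, span masking (text infilling), in plain Python. The function name, span-length heuristic, and the `mask_ratio`/`mean_span` parameters are illustrative assumptions, not the paper's exact configuration.

```python
import random

def text_infilling(tokens, mask_ratio=0.35, mean_span=3, mask="<mask>", seed=0):
    """Sketch of span-masking noise for denoising pre-training (assumed
    parameters): contiguous spans covering roughly mask_ratio of the tokens
    are each collapsed into a single <mask> token. The training pair is
    (noised tokens, original tokens)."""
    rng = random.Random(seed)
    budget = int(len(tokens) * mask_ratio)  # total tokens to mask
    out, i, masked = [], 0, 0
    while i < len(tokens):
        if masked < budget and rng.random() < mask_ratio:
            # Draw a span length around mean_span, capped by the budget.
            span = max(1, min(budget - masked, rng.randint(1, 2 * mean_span - 1)))
            out.append(mask)   # whole span becomes one mask token
            i += span
            masked += span
        else:
            out.append(tokens[i])
            i += 1
    return out

source = "we present a multilingual denoising sequence to sequence model".split()
noised = text_infilling(source)
# The model would then be trained to map `noised` back to `source`.
```

A real implementation would sample span lengths from a Poisson distribution and operate on subword IDs rather than whitespace tokens; this version only shows the shape of the corruption step.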
Key words
neural machine translation, pre-training