Multilingual Denoising Pre-training for Neural Machine Translation.
Transactions of the Association for Computational Linguistics (2020)
Abstract
This paper demonstrates that multilingual denoising pre-training produces significant performance gains across a wide variety of machine translation (MT) tasks. We present mBART—a sequence-to-seque...
Keywords
neural machine translation, pre-training