Multilingual Denoising Pre-training for Neural Machine Translation.

Transactions of the Association for Computational Linguistics (2020)

Citations: 1403 | Views: 1134
Abstract
This paper demonstrates that multilingual denoising pre-training produces significant performance gains across a wide variety of machine translation (MT) tasks. We present mBART—a sequence-to-sequence...
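The abstract refers to mBART's denoising pre-training, in which the model learns to reconstruct text corrupted by noise functions; one such function is text infilling, where spans with Poisson-sampled lengths (λ = 3.5, masking about 35% of tokens) are each replaced by a single mask token. Below is a minimal illustrative sketch of that noising step; the `infill` helper and its word-level treatment are assumptions for illustration, not the paper's actual (subword-level) implementation:

```python
import math
import random

MASK = "<mask>"

def sample_poisson(lam, rng):
    # Knuth's algorithm for sampling from a Poisson(lam) distribution.
    threshold = math.exp(-lam)
    k, p = 0, 1.0
    while True:
        p *= rng.random()
        if p <= threshold:
            return k
        k += 1

def infill(tokens, mask_ratio=0.35, lam=3.5, seed=0):
    """Replace random spans (Poisson-sampled lengths) with a single MASK
    token until roughly `mask_ratio` of the tokens have been masked."""
    rng = random.Random(seed)
    n = len(tokens)
    budget = int(mask_ratio * n)  # how many tokens we may still mask
    out, i = [], 0
    while i < n:
        if budget > 0 and rng.random() < mask_ratio:
            span = max(1, min(sample_poisson(lam, rng), budget, n - i))
            out.append(MASK)  # the whole span collapses to one mask token
            i += span
            budget -= span
        else:
            out.append(tokens[i])
            i += 1
    return out

tokens = "the quick brown fox jumps over the lazy dog".split()
print(infill(tokens))
```

During pre-training, the model is trained to regenerate the original token sequence from such a noised input, which is why the output can be shorter than the input: each masked span, whatever its length, appears as one `<mask>` token.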