Encouraging Lexical Translation Consistency for Document-Level Neural Machine Translation

EMNLP 2021

Abstract
Recently, a number of approaches have been proposed to improve translation performance for document-level neural machine translation (NMT). However, few focus on lexical translation consistency. In this paper we apply the principle of "one translation per discourse" to NMT, aiming to encourage lexical translation consistency for document-level NMT. This is done by first obtaining a word link for each source word in a document, which records the positions where that word appears. We then encourage the translations of the words within a link to be consistent in two ways. On the one hand, when encoding the sentences of a document, we exchange context information among those words. On the other hand, we propose an auxiliary loss function that constrains their translations to be consistent. Experimental results on Chinese↔English and English→French translation tasks show that our approach not only achieves state-of-the-art BLEU scores, but also greatly improves lexical translation consistency.