ACT-MNMT Auto-Constriction Turning for Multilingual Neural Machine Translation

Shaojie Dai, Xin Liu, Ping Luo, Yue Yu

arXiv (2024)

Abstract
Large language models (LLMs) have achieved promising performance on multilingual machine translation tasks through zero-/few-shot prompting or prompt tuning. However, because multilingual data are mixed during LLM pre-training, LLM-based translation models suffer from the off-target issue under both prompt-based methods, which manifests as instruction misunderstanding, translation into the wrong language, and over-generation. To address this issue, this paper introduces an Auto-Constriction Turning mechanism for Multilingual Neural Machine Translation (ACT-MNMT), a novel supervised fine-tuning mechanism that is orthogonal to traditional prompt-based methods. ACT-MNMT automatically constructs a constrained template on the target side by adding trigger tokens ahead of the ground truth. The trigger tokens can be freely arranged and combined to represent different task semantics, and they are iteratively updated to maximize the label likelihood. Experiments on WMT test sets with multiple metrics demonstrate that ACT-MNMT achieves substantially improved performance across multiple translation directions and reduces off-target phenomena in the translation.
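
The target-side template idea admits a compact illustration. Below is a minimal sketch of constrained-template supervised fine-tuning in Python with Hugging Face transformers, assuming a generic mT5-style encoder-decoder; the checkpoint name and the trigger-token strings (<trg_trans>, <trg_deu>) are illustrative placeholders, not the paper's exact configuration.

from transformers import AutoModelForSeq2SeqLM, AutoTokenizer

# Illustrative base checkpoint; the paper fine-tunes an LLM, but any
# seq2seq model works for sketching the mechanism.
tokenizer = AutoTokenizer.from_pretrained("google/mt5-small")
model = AutoModelForSeq2SeqLM.from_pretrained("google/mt5-small")

# Register trigger tokens; arranging and combining them encodes task
# semantics (here: task type + target language). Their embeddings are
# newly initialized and therefore trainable.
triggers = ["<trg_trans>", "<trg_deu>"]
tokenizer.add_special_tokens({"additional_special_tokens": triggers})
model.resize_token_embeddings(len(tokenizer))

src = "Translate English to German: The weather is nice today."
tgt = "Das Wetter ist heute schön."

# Constrained template: trigger tokens go ahead of the ground truth on the
# target side, so the decoder must commit to the task and target language
# before producing the translation itself.
constrained_tgt = " ".join(triggers) + " " + tgt

enc = tokenizer(src, return_tensors="pt")
labels = tokenizer(text_target=constrained_tgt, return_tensors="pt").input_ids

# One supervised fine-tuning step: the cross-entropy loss over the template
# plus ground truth is the label likelihood that the trigger tokens are
# iteratively updated to maximize.
out = model(input_ids=enc.input_ids, attention_mask=enc.attention_mask, labels=labels)
out.loss.backward()

Because the decoder is trained to emit the trigger prefix before the translation, the prefix acts as an explicit anchor for the task and target language at generation time, which is how a constrained template of this kind can counteract off-target output.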