To Translate or Not to Translate: A Systematic Investigation of Translation-Based Cross-Lingual Transfer to Low-Resource Languages.
CoRR (2023)
Abstract
Perfect machine translation (MT) would render cross-lingual transfer (XLT) by
means of multilingual language models (LMs) superfluous. Given, on the one hand,
the large body of work on improving XLT with multilingual LMs and, on the other
hand, recent advances in massively multilingual MT, in this work we
systematically evaluate existing and propose new translation-based XLT
approaches for transfer to low-resource languages. We show that all
translation-based approaches dramatically outperform zero-shot XLT with
multilingual LMs, and that the most effective approach combines round-trip
translation of the source-language training data with translation of the
target-language test instances. We next show that further empirical gains can
be obtained by adding reliable translations of the training data into other
high-resource languages. Moreover, we propose an effective translation-based
XLT strategy even for languages not supported by the MT system. Finally, we
show that model selection for XLT based on target-language validation data
obtained with MT outperforms model selection based on source-language data. We
hope that our findings encourage the adoption of more robust translation-based
baselines in XLT research.
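
For concreteness, the recipe the abstract identifies as most effective (round-trip translation of the source-language training data combined with translation of the target-language test instances) can be sketched as follows. This is a minimal illustration only: the choice of NLLB-200 via Hugging Face transformers, English as the source language, and Swahili as the example low-resource target are assumptions, since the abstract does not specify the paper's actual MT system or pipeline.

    from transformers import pipeline

    # Illustrative assumptions: NLLB-200 as the massively multilingual MT
    # system, English source, Swahili as an example low-resource target.
    MT_MODEL = "facebook/nllb-200-distilled-600M"
    SRC, TGT = "eng_Latn", "swh_Latn"

    fwd = pipeline("translation", model=MT_MODEL, src_lang=SRC, tgt_lang=TGT)
    bwd = pipeline("translation", model=MT_MODEL, src_lang=TGT, tgt_lang=SRC)

    def round_trip(texts):
        # Source -> target -> source, so the training data carries the same
        # "translationese" as MT-translated test inputs.
        to_tgt = [out["translation_text"] for out in fwd(texts)]
        return [out["translation_text"] for out in bwd(to_tgt)]

    # Fine-tune the task LM on round-trip-translated training data ...
    train_texts = ["The service was excellent.", "I would not recommend it."]
    train_texts_rt = round_trip(train_texts)

    # ... and at test time translate target-language instances into the
    # source language before feeding them to the fine-tuned model.
    test_tgt = ["Huduma ilikuwa nzuri sana."]
    test_src = [out["translation_text"] for out in bwd(test_tgt)]

The point of the round trip is distribution matching: the classifier is both trained and evaluated on MT output rather than on clean source-language text.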