CharSpan: Utilizing Lexical Similarity to Enable Zero-Shot Machine Translation for Extremely Low-resource Languages
CoRR(2023)
Abstract
We address the task of machine translation (MT) from extremely low-resource
languages (ELRLs) to English by leveraging cross-lingual transfer from a
'closely related' high-resource language (HRL). The development of an MT system
for ELRL is challenging because these languages typically lack parallel corpora
and monolingual corpora, and their representations are absent from large
multilingual language models. Many ELRLs share lexical similarities with some
HRLs, which presents a novel modeling opportunity. However, existing
subword-based neural MT models do not explicitly harness this lexical
similarity, as they only implicitly align HRL and ELRL latent embedding space.
To overcome this limitation, we propose CharSpan, a novel approach based on
injecting 'character-span noise' into the training data of the HRL. This serves
as a regularization technique, making the model more robust to 'lexical
divergences' between the HRL and ELRL, thus facilitating effective
cross-lingual transfer. Our method significantly outperformed strong baselines
in zero-shot settings on closely related HRL and ELRL pairs from three diverse
language families, emerging as the state-of-the-art model for ELRLs.
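The abstract does not spell out the augmentation procedure, but the core idea of character-span noise can be illustrated with a minimal sketch: randomly select a few short character spans in each HRL sentence and replace them with a noise token, making the model robust to surface-level lexical divergence. The function name, parameters, and the choice of `<unk>` as the noise token are illustrative assumptions, not the paper's exact recipe.

```python
import random

def charspan_noise(sentence, noise_token="<unk>", num_spans=2, max_span_len=3, seed=None):
    """Illustrative character-span noise augmentation (assumed variant):
    replace a few randomly chosen character spans with a noise token."""
    rng = random.Random(seed)
    chars = list(sentence)
    for _ in range(num_spans):
        if not chars:
            break
        start = rng.randrange(len(chars))          # random span start
        span_len = rng.randint(1, max_span_len)    # random span length
        chars[start:start + span_len] = [noise_token]
    return "".join(chars)
```

Applied to HRL training sentences, such noised inputs act as a regularizer: the model learns not to rely on exact subword forms, which eases zero-shot transfer to a lexically similar ELRL.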