Cross-Lingual Transfer for Natural Language Inference via Multilingual Prompt Translator
CoRR (2024)
Abstract
Cross-lingual transfer with prompt learning, built on multilingual pre-trained models, has shown promising effectiveness: a soft prompt learned in a source language is transferred to target languages for downstream tasks, particularly in low-resource scenarios. To transfer soft prompts efficiently, we propose a novel framework, Multilingual Prompt Translator (MPT), in which a multilingual prompt translator converts the crucial knowledge embedded in a prompt by changing its language knowledge while retaining its task knowledge. Concretely, we first train a prompt in the source language and employ the translator to map it into a target-language prompt. In addition, we use an external corpus as auxiliary data, on which we design an alignment task over predicted answer probabilities to convert language knowledge, thereby equipping the target prompt with multilingual knowledge. In few-shot settings on XNLI, MPT achieves remarkable improvements over strong baselines, and its advantage over vanilla prompting is especially pronounced when transferring to languages that differ substantially from the source language.
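The abstract describes two components: a translator network that maps a trained source-language soft prompt to a target-language prompt, and an alignment objective over predicted answer probabilities on auxiliary parallel data. Below is a minimal sketch of how such a translator and alignment loss might look in PyTorch; the bottleneck-MLP architecture, the residual connection, the KL-divergence choice, and all names and dimensions are illustrative assumptions, not the authors' exact implementation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class PromptTranslator(nn.Module):
    """Maps a trained source-language soft prompt to a target-language prompt.

    A two-layer bottleneck MLP with a residual connection is an assumption for
    illustration; the paper only states that a translator changes language
    knowledge while retaining task knowledge.
    """
    def __init__(self, hidden_dim: int, bottleneck: int = 256):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(hidden_dim, bottleneck),
            nn.Tanh(),
            nn.Linear(bottleneck, hidden_dim),
        )

    def forward(self, source_prompt: torch.Tensor) -> torch.Tensor:
        # source_prompt: (prompt_len, hidden_dim), frozen after source training.
        # The residual path preserves task knowledge; the MLP adjusts
        # language-specific knowledge.
        return source_prompt + self.net(source_prompt)


def alignment_loss(src_logits: torch.Tensor, tgt_logits: torch.Tensor) -> torch.Tensor:
    """Aligns answer distributions predicted with the source prompt (on
    source-language text) and the translated prompt (on parallel
    target-language text).

    KL divergence is one plausible reading of the "alignment task for
    predicted answer probability" mentioned in the abstract.
    """
    src_probs = F.softmax(src_logits.detach(), dim=-1)  # teacher: source prompt
    tgt_log_probs = F.log_softmax(tgt_logits, dim=-1)   # student: target prompt
    return F.kl_div(tgt_log_probs, src_probs, reduction="batchmean")
```

Under these assumptions, a training loop would keep the multilingual pre-trained model and the trained source prompt frozen, feed parallel sentences through the model with the source and translated prompts respectively, and optimize the translator with this alignment loss alongside the few-shot NLI objective in the target language.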