Consprompt: Exploiting Contrastive Samples for Few-Shot Prompt Learning

ICASSP 2024 - 2024 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP)

Abstract
Prompting has become an effective linguistic tool for leveraging pre-trained language models. In few-shot scenarios, however, subtle changes to a prompt's design can produce widely different results, and prompt-learning methods also tend to overfit the limited training samples. To alleviate this, we explore the use of suitable contrastive samples and multi-degree contrastive learning to improve the robustness of prompt representations. The proposed Consprompt, which combines a prompt-encoding network with contrastive sampling and contrastive scoring modules, realizes differential contrastive learning. Our results exhibit state-of-the-art performance across different few-shot settings, and ablation experiments confirm the effectiveness of multi-degree contrastive learning in the prompt-based fine-tuning process.
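The core idea the abstract describes — pulling prompt representations of same-label few-shot samples together while pushing different-label samples apart — can be illustrated with a supervised InfoNCE-style contrastive loss. This is a minimal sketch under our own assumptions, not the authors' implementation; the function name, temperature value, and use of NumPy are illustrative only.

```python
# Illustrative supervised contrastive (InfoNCE-style) loss over sample
# embeddings, as one might apply to prompt representations in few-shot
# fine-tuning. NOT the paper's code; names and defaults are assumptions.
import numpy as np

def contrastive_loss(embeddings: np.ndarray, labels: np.ndarray,
                     tau: float = 0.1) -> float:
    """Average supervised InfoNCE loss over L2-normalized embeddings.

    embeddings: (n, d) array of sample representations.
    labels:     (n,) array; samples sharing a label are positives.
    tau:        temperature scaling the cosine similarities.
    """
    # Normalize so that dot products are cosine similarities.
    z = embeddings / np.linalg.norm(embeddings, axis=1, keepdims=True)
    sim = z @ z.T / tau
    n = len(labels)
    losses = []
    for i in range(n):
        positives = [j for j in range(n) if j != i and labels[j] == labels[i]]
        if not positives:
            continue  # no positive pair for this anchor
        others = [j for j in range(n) if j != i]
        log_denom = np.log(np.sum(np.exp(sim[i, others])))
        # Negative mean log-probability of the positives vs. all others.
        losses.append(-np.mean(sim[i, positives] - log_denom))
    return float(np.mean(losses))
```

When same-label embeddings are already similar, the loss is near zero; mismatched labelings yield a much larger loss, which is the gradient signal that regularizes the prompt encoder against overfitting the few available samples.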
Keywords
Prompt learning, pre-trained language model, contrastive learning, few-shot learning