Se^2: Sequential Example Selection for In-Context Learning
arXiv (2024)
Abstract
The remarkable capability of large language models (LLMs) for in-context
learning (ICL) needs to be activated by demonstration examples. Prior work has
extensively explored the selection of examples for ICL, predominantly following
the "select then organize" paradigm. Such approaches often neglect the internal
relationships between examples and suffer from an inconsistency between
training and inference. In this paper, we formulate the problem as a
sequential selection problem and introduce Se^2, a
sequential-aware method that leverages the LLM's feedback on varying context,
aiding in capturing inter-relationships and sequential information among
examples, significantly enriching the contextuality and relevance of ICL
prompts. Meanwhile, we utilize beam search to seek and construct example
sequences, enhancing both quality and diversity. Extensive experiments across
23 NLP tasks from 8 distinct categories illustrate that Se^2 markedly
surpasses competitive baselines and achieves a 42% relative improvement over
random selection. Further in-depth analyses show the effectiveness of the proposed
strategies, highlighting Se^2's exceptional stability and adaptability across
various scenarios. Our code will be released to facilitate future research.
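
The abstract frames example selection as building a sequence step by step, with beam search keeping multiple high-scoring partial sequences alive. A minimal sketch of that search loop is below; the `score_fn` interface is a hypothetical stand-in for the paper's LLM feedback signal (e.g. the model's likelihood of the target given a prompt built from the partial sequence), and the toy scorer is illustrative only.

```python
def beam_search_select(candidates, score_fn, seq_len=3, beam_width=2):
    """Greedily grow example sequences, keeping the top `beam_width`
    partial sequences at each step (beam search).

    candidates: pool of demonstration examples.
    score_fn(seq): hypothetical scorer for a partial sequence; in the
    paper's setting this role is played by LLM feedback on the context.
    """
    beams = [((), 0.0)]  # (sequence, score)
    for _ in range(seq_len):
        expanded = []
        for seq, _ in beams:
            for ex in candidates:
                if ex in seq:  # no repeated examples in one sequence
                    continue
                new_seq = seq + (ex,)
                expanded.append((new_seq, score_fn(new_seq)))
        # keep the beam_width best extensions
        beams = sorted(expanded, key=lambda b: b[1], reverse=True)[:beam_width]
    return beams[0][0]

# Toy scorer: sequence score is the sum of per-example "relevance" values.
examples = [{"id": i, "rel": r} for i, r in enumerate([0.2, 0.9, 0.5, 0.7])]
toy_score = lambda seq: sum(ex["rel"] for ex in seq)
best = beam_search_select(examples, toy_score, seq_len=2, beam_width=2)
print([ex["id"] for ex in best])  # → [1, 3], the two most relevant examples
```

Because the scorer is called on whole partial sequences rather than on isolated examples, order and inter-example relationships can influence the score — the sequential awareness the abstract emphasizes over "select then organize" pipelines.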