Semi-Structured Chain-of-Thought: Integrating Multiple Sources of Knowledge for Improved Language Model Reasoning
arXiv (2023)
Abstract
An important open question in the use of large language models for
knowledge-intensive tasks is how to effectively integrate knowledge from three
sources: the model's parametric memory, external structured knowledge, and
external unstructured knowledge. Most existing prompting methods either rely on
one or two of these sources, or require repeatedly invoking large language
models to generate similar or identical content. In this work, we overcome
these limitations by introducing a novel semi-structured prompting approach
that seamlessly integrates the model's parametric memory with unstructured
knowledge from text documents and structured knowledge from knowledge graphs.
Experimental results on open-domain multi-hop question answering datasets
demonstrate that our prompting method significantly surpasses existing
techniques, even exceeding those that require fine-tuning.