ERA-CoT: Improving Chain-of-Thought through Entity Relationship Analysis
arXiv (2024)
Abstract
Large language models (LLMs) have achieved commendable accomplishments in
various natural language processing tasks. However, LLMs still encounter
significant challenges when dealing with complex scenarios involving multiple
entities. These challenges arise from the presence of implicit relationships
that demand multi-step reasoning. In this paper, we propose a novel approach
ERA-CoT, which aids LLMs in understanding context by capturing relationships
between entities and supports the reasoning of diverse tasks through
Chain-of-Thoughts (CoT). Experimental results show that ERA-CoT outperforms
current CoT prompting methods, achieving an average improvement of 5.1% on
GPT-3.5 over previous SOTA baselines. Our analysis indicates that
ERA-CoT increases the LLM's understanding of entity relationships,
significantly improves the accuracy of question answering, and enhances the
reasoning ability of LLMs.
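The abstract describes a staged approach: first capture the relationships between entities in the context, then feed those relationships into a chain-of-thought answer. The paper's actual prompts are not given here, so the sketch below is a hypothetical illustration of such a pipeline; the function names, stage breakdown, and prompt wording are all assumptions, not the authors' method.

```python
# Hypothetical sketch of an ERA-CoT-style prompting pipeline.
# Stage names and prompt wording are illustrative assumptions;
# each function builds the prompt that would be sent to an LLM.

def entity_extraction_prompt(context: str) -> str:
    """Stage 1: ask the LLM to list the entities in the context."""
    return f"{context}\n\nList all named entities mentioned above."


def relation_inference_prompt(context: str, entities: list[str]) -> str:
    """Stage 2: ask for explicit relationships, then have the model
    infer implicit ones step by step (the multi-step reasoning the
    abstract says plain CoT struggles with)."""
    entity_list = ", ".join(entities)
    return (
        f"{context}\n\nEntities: {entity_list}\n"
        "First, state the relationships explicitly mentioned in the text. "
        "Then infer any implicit relationships between these entities, "
        "reasoning step by step."
    )


def answer_prompt(context: str, relations: str, question: str) -> str:
    """Final stage: answer the question with the extracted entity
    relationships added to the context as intermediate evidence."""
    return (
        f"{context}\n\nKnown entity relationships:\n{relations}\n\n"
        f"Question: {question}\nLet's think step by step."
    )


# Usage with a toy context (the LLM calls themselves are omitted):
ctx = "Alice is Bob's manager. Bob mentors Carol."
p1 = entity_extraction_prompt(ctx)
p2 = relation_inference_prompt(ctx, ["Alice", "Bob", "Carol"])
p3 = answer_prompt(
    ctx,
    "Alice manages Bob; Bob mentors Carol.",
    "Who is senior to Carol's mentor?",
)
```

Each stage's model output would be parsed and passed to the next prompt; the key idea, per the abstract, is that making entity relationships explicit before answering improves multi-step reasoning over contexts with many entities.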