AGCVT-prompt for sentiment classification: Automatically generating chain of thought and verbalizer in prompt learning

ENGINEERING APPLICATIONS OF ARTIFICIAL INTELLIGENCE (2024)

Abstract
Large language models (LLMs) have revolutionized natural language processing, but they require significant data and hardware resources. Prompt learning offers a solution by enabling a single model to serve multiple downstream tasks. However, current prompt learning methods rely on costly prompt templates for training. This is a challenge for tasks such as sentiment classification, where high-quality templates are hard to create and templates composed of pseudo-tokens can be expensive to train. Recent studies on chain of thought (COT) have shown that making certain aspects of the reasoning process explicit can improve the performance of LLMs. With this in mind, this research introduces the auto-generated COT and verbalizer templates (AGCVT-Prompt) technique, which clusters unlabeled texts according to their identified topic and sentiment. It then generates dual verbalizers and formulates both topic and sentiment prompt templates from the categories discerned in the texts and the verbalizers. This method significantly improves the transparency and interpretability of the model's decision-making process. AGCVT-Prompt was evaluated against conventional prompt learning and advanced sentiment classification methods, using state-of-the-art LLMs on both Chinese and English datasets, and showed superior performance in all evaluations. In particular, AGCVT-Prompt outperformed previous prompt learning techniques in few-shot scenarios and provided stronger zero-shot and few-shot learning capabilities. Additionally, AGCVT-Prompt was applied to analyze online comments about Coronavirus Disease 2019 (COVID-19), yielding valuable insights. These findings indicate that AGCVT-Prompt is a promising alternative for sentiment classification tasks, particularly when labeled data is scarce.
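The sketch below is a minimal, illustrative outline of the general workflow the abstract describes, not the authors' implementation: it uses TF-IDF plus k-means as a stand-in for the paper's topic clustering, hand-seeded sentiment label words as a stand-in for the generated sentiment verbalizer, and a hypothetical build_prompt helper with made-up example texts.

```python
# Illustrative sketch of the AGCVT-Prompt idea (assumptions: TF-IDF + k-means
# stand in for the paper's clustering; the sentiment verbalizer is hand-seeded).
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.cluster import KMeans
import numpy as np

texts = [
    "The vaccine rollout in my city has been surprisingly smooth.",
    "Lockdown rules keep changing and nobody explains why.",
    "Masks are uncomfortable but they clearly help.",
    "The new phone's battery life is fantastic.",
    "Customer support never answered my emails about the phone.",
]

# 1. Cluster unlabeled texts by approximate topic.
vectorizer = TfidfVectorizer(stop_words="english")
X = vectorizer.fit_transform(texts)
kmeans = KMeans(n_clusters=2, n_init=10, random_state=0).fit(X)

# 2. Topic verbalizer: the most salient terms of each cluster.
terms = np.array(vectorizer.get_feature_names_out())
topic_verbalizer = {
    c: terms[kmeans.cluster_centers_[c].argsort()[::-1][:3]].tolist()
    for c in range(kmeans.n_clusters)
}

# 3. Sentiment verbalizer mapping label words to classes (hand-seeded here).
sentiment_verbalizer = {"positive": ["good", "great"], "negative": ["bad", "poor"]}

# 4. Assemble a chain-of-thought-style template that reasons about the topic
#    first, then asks for the sentiment label.
def build_prompt(text: str, cluster: int) -> str:
    topic_words = ", ".join(topic_verbalizer[cluster])
    label_words = " / ".join(sentiment_verbalizer)
    return (
        f"Text: {text}\n"
        f"Step 1: This text is mainly about [{topic_words}].\n"
        f"Step 2: Considering that topic, the writer's attitude is [MASK].\n"
        f"Answer with one of: {label_words}."
    )

for i, t in enumerate(texts[:2]):
    print(build_prompt(t, kmeans.labels_[i]), "\n")
```

In this reading, the clustering step supplies the topic-level reasoning cue automatically, so the prompt template and both verbalizers can be constructed from unlabeled data rather than hand-crafted templates.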
Keywords
Large language models, Prompt learning, Sentiment classification, Chain of thought