Can we Soft Prompt LLMs for Graph Learning Tasks?
arXiv (2024)
Abstract
Graphs play an important role in representing complex relationships in
real-world applications such as social networks, biological data, and citation
networks. In recent years, Large Language Models (LLMs) have achieved
tremendous success in various domains, which makes applying LLMs to graphs
particularly appealing. However, directly applying LLMs to graph modalities
presents unique challenges due to the discrepancy and mismatch between the
graph and text modalities. Hence, to further investigate LLMs' potential for
comprehending graph information, we introduce GraphPrompter, a novel framework
designed to align graph information with LLMs via soft prompts. Specifically,
GraphPrompter consists of two main components: a graph neural network to encode
complex graph information and an LLM that effectively processes textual
information. Comprehensive experiments on various benchmark datasets under node
classification and link prediction tasks demonstrate the effectiveness of our
proposed method. The GraphPrompter framework unveils the substantial
capabilities of LLMs as predictors in graph-related tasks, enabling researchers
to utilize LLMs across a spectrum of real-world graph scenarios more
effectively.
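The two-component design described above (a GNN that encodes the graph, plus a projection that turns its output into a soft prompt the LLM can consume alongside text) can be sketched minimally as follows. This is an illustrative numpy sketch, not the paper's implementation: the mean-aggregation GNN layer, all dimensions, and the single-layer projection are hypothetical choices standing in for the actual GraphPrompter architecture.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical sizes (not from the paper)
n_nodes, feat_dim, gnn_dim, llm_dim = 5, 8, 16, 32

# Toy graph: adjacency matrix with self-loops, plus random node features
adj = np.eye(n_nodes)
adj[0, 1] = adj[1, 0] = 1.0
adj[1, 2] = adj[2, 1] = 1.0
x = rng.normal(size=(n_nodes, feat_dim))

# One mean-aggregation GNN layer: h_v = ReLU(mean over neighbors of x_u @ W)
W = rng.normal(size=(feat_dim, gnn_dim))
h = np.maximum((adj @ x @ W) / adj.sum(axis=1, keepdims=True), 0.0)

# A learned projection aligns the GNN output with the LLM's token-embedding
# space; the projected vector is prepended to the embedded text tokens as a
# soft prompt, so the frozen LLM conditions on graph structure.
P = rng.normal(size=(gnn_dim, llm_dim))
graph_token = (h[0] @ P)[None, :]            # soft prompt for node 0: (1, llm_dim)

text_embeds = rng.normal(size=(6, llm_dim))  # stand-in for embedded prompt tokens
llm_input = np.concatenate([graph_token, text_embeds], axis=0)
print(llm_input.shape)  # (7, 32)
```

In training, only the GNN and projection weights would be updated while the LLM stays frozen, which is what makes the graph token a "soft prompt" rather than a fine-tuned input.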