A Condensed Transition Graph Framework for Zero-shot Link Prediction with Large Language Models
CoRR (2024)
Abstract
Zero-shot link prediction (ZSLP) on knowledge graphs aims to automatically
identify relations between given entities. Existing methods primarily employ
auxiliary information to predict the tail entity given a head entity and its
relation, yet they face challenges because such detailed information is not
always available and because predicting tail entities based on semantic
similarities is inherently simple. Although Large Language Models (LLMs) offer
a promising solution for predicting unobserved relations between head and tail
entities in a zero-shot manner, their performance remains limited by their
inability to leverage the information of all the (exponentially many) paths
between two entities, which collectively indicate their relation types. To
address this, we introduce a Condensed Transition Graph Framework for Zero-Shot
Link Prediction (CTLP), which encodes the information of all paths in linear
time complexity to predict unseen relations between entities, attaining both
efficiency and information preservation. Specifically, we design a condensed
transition graph encoder with theoretical guarantees on its coverage,
expressiveness, and efficiency, learned with a transition graph contrastive
learning strategy. Subsequently, we design a soft instruction tuning scheme
that learns to map the all-path embedding into the input space of LLMs.
Experimental results show that our proposed CTLP method achieves
state-of-the-art performance on three standard ZSLP datasets.
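To give intuition for the abstract's "all paths in linear time" claim, the sketch below illustrates the standard dynamic-programming principle it relies on: walk information between two entities can be aggregated by repeated transition-matrix products at cost linear in the number of edges per step, even though the number of individual paths grows exponentially with length. This is an illustrative assumption-laden toy, not the paper's CTLP encoder; the function name `all_walk_embedding` and the walk-count features are hypothetical stand-ins for the richer condensed-graph features the paper describes.

```python
# Illustrative sketch (NOT the paper's CTLP encoder): aggregating walks of
# length <= K between two entities via matrix-vector products costs
# O(K * |E|) time, while enumerating the individual paths can be exponential.
import numpy as np

def all_walk_embedding(adj: np.ndarray, head: int, tail: int, max_len: int = 4):
    """Count walks from `head` to `tail` for each length 1..max_len.

    `adj` is a dense adjacency matrix for simplicity; a sparse matrix works
    identically and keeps each step linear in the number of edges. The
    returned vector (k-th entry = number of walks of length k+1) is a crude
    stand-in for the per-path features a real encoder would condense.
    """
    n = adj.shape[0]
    frontier = np.zeros(n)
    frontier[head] = 1.0               # one walk of length 0, ending at `head`
    counts = []
    for _ in range(max_len):
        frontier = frontier @ adj      # extend every walk by one edge
        counts.append(frontier[tail])  # walks of this length reaching `tail`
    return np.array(counts)

# Toy graph: 0 -> 1 -> 3, 0 -> 2 -> 3, 1 -> 2
A = np.zeros((4, 4))
for u, v in [(0, 1), (0, 2), (1, 3), (2, 3), (1, 2)]:
    A[u, v] = 1.0
print(all_walk_embedding(A, head=0, tail=3))  # [0., 2., 1., 0.]
```

Under this framing, a fixed-size embedding summarizing all walks can feed a downstream predictor (here, conceptually, the soft prompt passed to the LLM) without ever materializing the exponential path set.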