Fair large kernel embedding with relation-specific features extraction for link prediction

Information Sciences (2024)

Abstract
Knowledge graph embedding is a crucial technique for addressing the challenge of incomplete knowledge graphs, and convolutional neural networks are widely applied in this domain. However, convolutional neural networks struggle to capture long-range dependencies between entities and relations in feature maps. Moreover, existing knowledge graph embedding models often treat all relations in a knowledge graph from a uniform perspective, thereby overlooking relation-specific features. To address these two issues, two novel knowledge graph embedding models, Large Kernel Embedding (LKE) and Large Kernel Embedding with Relation-specific Features Extraction (LKER), are proposed. These models improve the effectiveness of knowledge graph embedding by integrating fair large kernel attention and introducing a parallel branch for relation-specific feature extraction. Specifically, fair large kernel attention is incorporated into LKE to capture long-range dependencies within the model. Building on LKE, LKER adds a parallel relation-specific feature extraction branch to obtain more comprehensive relation-specific feature information. The proposed models are extensively evaluated on five benchmark datasets, and the results show that they yield significant performance improvements in link prediction over both classical and recent knowledge graph embedding models.
Keywords
Knowledge graph embedding, Large kernel attention, Knowledge graph, Link prediction
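The abstract describes the "fair large kernel attention" module only at a high level, so the sketch below is a minimal, hypothetical PyTorch illustration of a generic large-kernel-attention block applied to 2D feature maps such as reshaped (head entity, relation) embeddings. The decomposition into a depthwise convolution, a dilated depthwise convolution, and a pointwise convolution, followed by element-wise gating of the input, follows the common large-kernel-attention formulation; the paper's "fair" variant, the kernel sizes, and the tensor shapes are assumptions, not the authors' implementation.

```python
# Hypothetical sketch of a large-kernel-attention block (not the paper's exact module).
import torch
import torch.nn as nn


class LargeKernelAttention(nn.Module):
    def __init__(self, channels: int):
        super().__init__()
        # 5x5 depthwise conv captures local context.
        self.dw_conv = nn.Conv2d(channels, channels, kernel_size=5,
                                 padding=2, groups=channels)
        # 7x7 depthwise conv with dilation 3 enlarges the receptive field
        # to approximate a large kernel for long-range dependencies.
        self.dw_dilated = nn.Conv2d(channels, channels, kernel_size=7,
                                    padding=9, dilation=3, groups=channels)
        # 1x1 pointwise conv mixes channel information.
        self.pw_conv = nn.Conv2d(channels, channels, kernel_size=1)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        attn = self.pw_conv(self.dw_dilated(self.dw_conv(x)))
        return attn * x  # attention map gates the input feature map


if __name__ == "__main__":
    # Assumed usage: a batch of feature maps obtained by reshaping the
    # concatenated head-entity and relation embeddings (ConvE-style layout).
    feats = torch.randn(128, 32, 10, 20)  # (batch, channels, height, width)
    lka = LargeKernelAttention(channels=32)
    print(lka(feats).shape)               # torch.Size([128, 32, 10, 20])
```

Under this reading, LKER would add a second, relation-conditioned branch in parallel with such a block and fuse the two outputs before scoring candidate tail entities; the exact fusion and scoring functions are not specified in the abstract.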