Dynamic Knowledge Graph Completion with Jointly Structural and Textual Dependency

International Conference on Algorithms and Architectures for Parallel Processing (2020)

Abstract
Knowledge Graph Completion (KGC) aims to fill in missing facts in Knowledge Graphs (KGs). Because most real-world KGs evolve quickly, with new entities and relations being added by the minute, the dynamic KGC task is more practical than the static one: it allows the KG to scale up easily as new entities and relations are added. Most existing dynamic KGC models ignore the dependency between multi-source information and topological structure, and thus lose much of the semantic information in KGs. In this paper, we propose a novel dynamic KGC model with joint structural and textual dependency based on a deep recurrent neural network (DKGC-JSTD). The model learns embeddings of an entity's name and parts of its text description to connect unseen entities to the KG. To establish the relevance between the text description and the topological information, DKGC-JSTD uses a deep memory network and an association matching mechanism to extract semantic features relating an entity and its relations from the entity's text description, and then uses a deep recurrent neural network to model the dependency between topological structure and text description. Experiments on large datasets, both old and new, show that DKGC-JSTD performs well on the dynamic KGC task.
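The abstract only outlines the approach; the sketch below is a minimal, hypothetical PyTorch illustration of the general idea (relation-guided attention over description tokens standing in for the association matching step, and a GRU coupling the text-derived and structural embeddings). All class names, layer sizes, and the exact attention form are assumptions for illustration, not the authors' implementation.

```python
# Hypothetical sketch of the idea in the abstract, NOT the authors' code.
# Assumed: layer sizes, attention form, and how the structural embedding
# is obtained are placeholders chosen for illustration only.
import torch
import torch.nn as nn

class DKGCSketch(nn.Module):
    """Encodes an entity's description, attends over it with a relation
    query, and couples the text-derived and structural embeddings with
    a recurrent layer to produce a fused entity representation."""

    def __init__(self, vocab_size, num_relations, dim=128):
        super().__init__()
        self.tok_emb = nn.Embedding(vocab_size, dim, padding_idx=0)
        self.rel_emb = nn.Embedding(num_relations, dim)
        self.desc_encoder = nn.GRU(dim, dim, batch_first=True)
        # GRU over the length-2 sequence [text embedding, structural
        # embedding], a stand-in for the paper's recurrent coupling of
        # text description and topological structure.
        self.joint_rnn = nn.GRU(dim, dim, batch_first=True)

    def forward(self, desc_tokens, rel_ids, struct_emb):
        # desc_tokens: (B, L) token ids of the entity description
        # rel_ids:     (B,)   relation ids used as attention queries
        # struct_emb:  (B, D) structural embedding of the entity
        hid, _ = self.desc_encoder(self.tok_emb(desc_tokens))   # (B, L, D)
        query = self.rel_emb(rel_ids).unsqueeze(2)              # (B, D, 1)
        attn = torch.softmax(hid.bmm(query).squeeze(2), dim=1)  # (B, L)
        text_emb = (attn.unsqueeze(2) * hid).sum(dim=1)         # (B, D)
        seq = torch.stack([text_emb, struct_emb], dim=1)        # (B, 2, D)
        _, joint = self.joint_rnn(seq)                          # (1, B, D)
        return joint.squeeze(0)                                 # fused entity vector

# Usage: embed an unseen entity from its description plus a structural vector.
model = DKGCSketch(vocab_size=10000, num_relations=200)
desc = torch.randint(1, 10000, (4, 30))
rels = torch.randint(0, 200, (4,))
struct = torch.randn(4, 128)
fused = model(desc, rels, struct)   # shape (4, 128)
```

Because unseen entities have no trained structural embedding, a model of this kind would typically derive their representation primarily from the text pathway; the sketch above simply exposes both inputs.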
Keywords
dynamic knowledge graph completion, jointly structural, textual