Generative adversarial networks based on Wasserstein distance for knowledge graph embeddings

Knowledge-Based Systems (2020)

Abstract
Knowledge graph embedding aims to project entities and relations into low-dimensional, continuous semantic feature spaces, and has attracted increasing attention in recent years. Most existing models construct negative samples by uniformly random corruption, so the resulting corrupted triplets are largely trivial for training the embedding model. Inspired by generative adversarial networks (GANs), a generator can be employed to sample more plausible negative triplets, which pushes the discriminator to further improve its embedding performance. However, the vanishing gradient on discrete data is an inherent problem of traditional GANs. In this paper, we propose a GAN-based knowledge graph representation learning model that introduces the Wasserstein distance in place of the traditional divergence to address this issue. Moreover, additional weak supervision information is incorporated to refine the embedding model, since such textual information contains detailed semantic descriptions and offers abundant semantic relevance. In the experiments, we evaluate our method on the tasks of link prediction and triplet classification. The results indicate that the Wasserstein distance solves the vanishing-gradient problem on discrete data and accelerates convergence, and that the additional weak supervision information significantly improves the performance of the model.
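The abstract gives no implementation details, so the following is only a minimal, hypothetical sketch (in PyTorch) of the general idea it describes: an embedding model acting as the critic/discriminator, scored on true triplets versus generator-sampled negatives, trained with a Wasserstein-style gap objective instead of a JS-divergence GAN loss. The names TransEScorer and wasserstein_critic_loss, and the TransE-style scoring function, are assumptions for illustration and are not taken from the paper.

```python
import torch
import torch.nn as nn

class TransEScorer(nn.Module):
    """Hypothetical TransE-style discriminator: lower score = more plausible triplet."""
    def __init__(self, n_entities, n_relations, dim=50):
        super().__init__()
        self.ent = nn.Embedding(n_entities, dim)
        self.rel = nn.Embedding(n_relations, dim)

    def forward(self, h, r, t):
        # Plausibility as the L1 distance || h + r - t || in embedding space.
        return (self.ent(h) + self.rel(r) - self.ent(t)).norm(p=1, dim=-1)

def wasserstein_critic_loss(scorer, pos, neg):
    """Wasserstein-style gap objective: push scores of true triplets down and
    scores of generator-sampled negatives up. Unlike a JS-divergence loss,
    the gradient does not vanish when the two discrete distributions barely overlap."""
    pos_score = scorer(*pos)   # scores of observed (true) triplets
    neg_score = scorer(*neg)   # scores of sampled negative triplets
    return pos_score.mean() - neg_score.mean()

if __name__ == "__main__":
    scorer = TransEScorer(n_entities=100, n_relations=10)
    # Toy batch: 32 random positive triplets and tail-corrupted negatives
    # (standing in for generator output in this sketch).
    pos = (torch.randint(0, 100, (32,)),
           torch.randint(0, 10, (32,)),
           torch.randint(0, 100, (32,)))
    neg = (pos[0], pos[1], torch.randint(0, 100, (32,)))
    loss = wasserstein_critic_loss(scorer, pos, neg)
    loss.backward()
    print(loss.item())
```

In a full WGAN setup the critic would additionally be kept (approximately) 1-Lipschitz, e.g. via weight clipping or a gradient penalty, and the negatives would come from a trained generator rather than random corruption; those parts are omitted here.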
Keywords
Knowledge graph embedding, Generative adversarial networks, Wasserstein distance, Weak supervision information