OAG$_{\mathrm{know}}$: Self-Supervised Learning for Linking Knowledge Graphs

IEEE Transactions on Knowledge and Data Engineering (2023)

Abstract
We propose a self-supervised embedding learning framework—SelfLinKG—to link concepts in heterogeneous knowledge graphs. Without any labeled data, SelfLinKG achieves competitive performance against its supervised counterpart and significantly outperforms state-of-the-art unsupervised methods by 26%-50% under the linear classification protocol. The essential components of SelfLinKG are local attention-based encoding and momentum contrastive learning. The former learns graph representations using an attention network, while the latter learns a self-supervised model across knowledge graphs using contrastive learning. SelfLinKG has been deployed to build the new version of the Open Academic Graph (OAG), called OAG$_{\mathrm{know}}$. All data and code are publicly available.
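To make the momentum contrastive learning component concrete, below is a minimal PyTorch sketch of a MoCo-style objective: a query encoder trained by backpropagation, a key encoder updated by an exponential moving average, and a queue of negatives. This is an illustrative assumption based on the abstract's description, not the authors' released implementation; the toy MLP encoder, class name `MomentumContrast`, and all hyperparameters are hypothetical stand-ins for the paper's local attention-based graph encoder.

```python
# Hypothetical MoCo-style momentum contrastive learning sketch.
# The MLP encoder stands in for SelfLinKG's attention-based graph
# encoder; names and hyperparameters are assumptions, not the paper's.
import torch
import torch.nn as nn
import torch.nn.functional as F

class MomentumContrast(nn.Module):
    def __init__(self, dim=128, queue_size=4096, momentum=0.999, temperature=0.07):
        super().__init__()
        self.m = momentum
        self.t = temperature
        # Query encoder (trained by backprop) and key encoder (EMA copy).
        self.encoder_q = nn.Sequential(nn.Linear(dim, dim), nn.ReLU(), nn.Linear(dim, dim))
        self.encoder_k = nn.Sequential(nn.Linear(dim, dim), nn.ReLU(), nn.Linear(dim, dim))
        for pq, pk in zip(self.encoder_q.parameters(), self.encoder_k.parameters()):
            pk.data.copy_(pq.data)
            pk.requires_grad = False  # keys come from momentum updates, not gradients
        # Queue of normalized negative keys.
        self.register_buffer("queue", F.normalize(torch.randn(queue_size, dim), dim=1))
        self.register_buffer("ptr", torch.zeros(1, dtype=torch.long))

    @torch.no_grad()
    def _momentum_update(self):
        # Exponential moving average of the query encoder's weights.
        for pq, pk in zip(self.encoder_q.parameters(), self.encoder_k.parameters()):
            pk.data = pk.data * self.m + pq.data * (1.0 - self.m)

    @torch.no_grad()
    def _enqueue(self, keys):
        # Replace the oldest entries (assumes queue_size % batch == 0).
        n, ptr = keys.shape[0], int(self.ptr)
        self.queue[ptr:ptr + n] = keys
        self.ptr[0] = (ptr + n) % self.queue.shape[0]

    def forward(self, x_q, x_k):
        q = F.normalize(self.encoder_q(x_q), dim=1)
        with torch.no_grad():
            self._momentum_update()
            k = F.normalize(self.encoder_k(x_k), dim=1)
        # Positive logits from matched pairs; negatives from the queue.
        l_pos = (q * k).sum(dim=1, keepdim=True)
        l_neg = q @ self.queue.t()
        logits = torch.cat([l_pos, l_neg], dim=1) / self.t
        labels = torch.zeros(logits.shape[0], dtype=torch.long)  # positive is index 0
        loss = F.cross_entropy(logits, labels)
        self._enqueue(k)
        return loss

# Usage with random features standing in for two views of a concept:
moco = MomentumContrast()
loss = moco(torch.randn(64, 128), torch.randn(64, 128))
```

The large queue decouples the number of negatives from the batch size, while the slowly moving key encoder keeps the queued keys consistent, which is the usual motivation for momentum contrast over an in-batch contrastive loss.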
Key words
Concept linking, self-supervised learning, contrastive learning, knowledge base