Knowledge graph attention mechanism for distant supervision neural relation extraction

Knowledge-Based Systems (2022)

Abstract
Distant supervision neural relation extraction classifies the relation labels of instance bags that share the same head–tail entity pair. Since an entity pair usually carries multiple relation labels, instance bags contain intensive noise. Recent works have mostly focused on attention mechanisms that down-weight sentences whose ground-truth relation labels differ from that of the instance bag. However, a general weakness is the failure to explore the semantic correlations between an entity pair and its context. Additionally, the number of relation categories follows a long-tail distribution, so extracting long-tail relations remains a challenge. Therefore, the Knowledge Graph ATTention (KGATT) mechanism is proposed to address both the noise and the long-tail problem; it consists of two modules: a fine-alignment mechanism and an inductive mechanism. In particular, the fine-alignment mechanism learns the ground-truth relation of each sentence by aligning it with all predefined relations. The inductive mechanism learns enhanced relations from neighbors in the knowledge graph (KG) to compensate for data scarcity. Through the mutual reinforcement of the two modules, the model enriches the representation of the instance bag, which not only improves generalization ability but also alleviates the long-tail phenomenon. Extensive experiments and ablation studies on the NYT-FB60K and GIDS-FB8K datasets show that KGATT is effective in improving performance. Built on the Piecewise Convolutional Neural Network (PCNN), the model achieves superior performance on various metrics as well as on long-tail relations.
Keywords
Knowledge graph, Long-tail phenomenon, Fine-alignment mechanism, Inductive mechanism
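The abstract does not give the exact formulation of the fine-alignment mechanism, but the general idea it describes, scoring each sentence in a bag against all predefined relation embeddings and using that alignment to weight sentences when pooling the bag representation, can be illustrated with a minimal sketch. All names and dimensions here (hidden_dim, num_relations, bag_size, target_relation) are assumptions for illustration, not the paper's actual architecture.

```python
import torch
import torch.nn.functional as F

# Assumed dimensions for one instance bag.
hidden_dim, num_relations, bag_size = 128, 53, 4

sentence_reprs = torch.randn(bag_size, hidden_dim)        # e.g. PCNN sentence encodings
relation_embeds = torch.randn(num_relations, hidden_dim)  # predefined relation embeddings

# Fine alignment (sketch): similarity of every sentence to every predefined relation,
# giving each sentence its own relation profile rather than inheriting the bag label.
alignment = sentence_reprs @ relation_embeds.t()           # (bag_size, num_relations)
per_sentence_profile = F.softmax(alignment, dim=-1)

# Selective attention over the bag: weight sentences by their alignment with the
# bag's target relation (index assumed), then pool into one bag representation.
target_relation = 7
weights = F.softmax(alignment[:, target_relation], dim=0)  # (bag_size,)
bag_repr = weights @ sentence_reprs                        # attention-pooled bag vector

# Score the bag against all relations for classification.
logits = bag_repr @ relation_embeds.t()
print(logits.shape)  # torch.Size([53])
```

The inductive module described in the abstract would additionally enrich relation_embeds with information aggregated from KG neighbors; that step is omitted here since its form is not specified in this summary.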