Well-Written Knowledge Graphs: Most Effective RDF Syntaxes for Triple Linearization in End-to-End Extraction of Relations from Texts (Student Abstract)

Célian Ringwald, Fabien Gandon, Catherine Faron, Franck Michel, Hanna Abi Akl

AAAI 2024 (2024)

Abstract
Seq-to-seq generative models have recently gained attention for the relation extraction task. By approaching the problem end-to-end, they surpass encoder-only models. However, little research has investigated how the output syntax affects the training of these models, and only a limited number of approaches extract ready-to-load knowledge graphs conforming to the RDF standard. In this paper, we consider that a set of triples can be linearized in many different ways, and we evaluate the combined effect of language model size and of different RDF syntaxes on relation extraction from Wikipedia abstracts.
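To illustrate the point that one triple set admits several linearizations, the following minimal sketch (not taken from the paper) serializes the same graph in several RDF syntaxes with rdflib; the entity URIs and predicates are purely illustrative, and each serialization would constitute a different target output string for a seq-to-seq model.

```python
# Minimal sketch (assumes rdflib >= 6 is installed): the same RDF triples
# rendered in several syntaxes. URIs are illustrative placeholders.
from rdflib import Graph, Namespace

EX = Namespace("http://example.org/")
g = Graph()
g.bind("ex", EX)
g.add((EX.Barack_Obama, EX.birthPlace, EX.Honolulu))
g.add((EX.Barack_Obama, EX.spouse, EX.Michelle_Obama))

# Each format below is a different linearization of the identical triple set.
for fmt in ("nt", "turtle", "xml", "json-ld"):
    print(f"--- {fmt} ---")
    print(g.serialize(format=fmt))
```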
Keywords
DMKM: Linked Open Data Knowledge Graphs & KB Completion, NLP: Information Extraction, DMKM: Knowledge Acquisition From The Web, NLP: Large Language Models