FAT-RE: A faster dependency-free model for relation extraction

Journal of Web Semantics (2020)

Cited by 6 | Viewed 39
Abstract
In recent years, the dependency tree has proven to be an effective source of information for relation extraction. However, two problems remain in previous methods: (1) the dependency tree relies on external parsing tools and must be carefully integrated, trading off between pruning noisy words and preserving semantic integrity; (2) dependency-based methods still have to encode the sequential context as a supplement, which costs extra time. To tackle these two problems, we propose a faster, dependency-free model: treating the sentence as a fully-connected graph, we customize the vanilla Transformer architecture to remove irrelevant information via a filtering mechanism and to aggregate sentence information through an enhanced query. Our model yields comparable results on the SemEval-2010 Task 8 dataset and better results on the TACRED dataset, without requiring external information from a dependency tree and with improved time efficiency.
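The abstract's two core ideas, filtering out irrelevant tokens and aggregating the sentence through a query, can be sketched in a few lines. The following is a minimal, hypothetical NumPy illustration (not the authors' implementation): a sigmoid gate plays the role of the filtering mechanism, and a single learned query vector attends over the gated token states to produce one sentence representation. All names, shapes, and the random toy inputs are assumptions for illustration only.

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def filter_and_aggregate(H, w_gate, q):
    """Hypothetical sketch: gate token states (filtering),
    then pool them with a query vector (aggregation)."""
    gate = 1.0 / (1.0 + np.exp(-(H @ w_gate)))     # (n,) per-token relevance gate
    H_f = H * gate[:, None]                        # suppress irrelevant tokens
    attn = softmax(H_f @ q / np.sqrt(H.shape[1]))  # (n,) query attention weights
    return attn @ H_f                              # (d,) sentence representation

n, d = 6, 8                        # toy sentence length and hidden size (assumed)
H = rng.normal(size=(n, d))        # contextual token states, e.g. from a Transformer
w_gate = rng.normal(size=d)        # filtering-gate parameters (assumed)
q = rng.normal(size=d)             # enhanced query vector (assumed)

sent_vec = filter_and_aggregate(H, w_gate, q)
print(sent_vec.shape)              # prints (8,)
```

In this reading, the gate replaces dependency-tree pruning (down-weighting noisy words instead of deleting subtrees), and the query-based pooling replaces a separate sequential encoder, which is where the claimed speed-up would come from.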
Keywords
Relation extraction,Filtering,Aggregation,Transformer