Self-Supervised Rule Learning to Link Text Segments to Relational Elements of Structured Knowledge

Shajith Ikbal, Udit Sharma, Hima Karanam, Sumit Neelam, Ronny Luss, Dheeraj Sreedhar, Pavan Kapanipathi, Naweed Khan, Kyle Erwin, Ndivhuwo Makondo, Ibrahim Abdelaziz, Achille Fokoue, Alexander Gray, Maxwell Crouse, Subhajit Chaudhury, Chitra Subramanian

EMNLP 2023

Abstract
We present a neuro-symbolic approach to self-learn rules that serve as interpretable knowledge for performing relation linking in knowledge base question answering systems. These rules define natural language text predicates as a weighted mixture of knowledge base paths. The weights learned during training effectively serve as the mapping needed to perform relation linking. We use the popular masked training strategy to self-learn the rules. A key distinguishing aspect of our work is that the masked training operates over logical forms of the sentences instead of their natural language text form. This offers the opportunity to extract extended context information from the structured knowledge source and use it to build robust and human-readable rules. We evaluate the accuracy and usefulness of such learned rules by utilizing them for prediction of missing kinship relations in the CLUTRR dataset and for relation linking in a KBQA system using the SWQ-WD dataset. Results demonstrate the effectiveness of our approach: its generalizability, interpretability, and ability to achieve an average performance gain of 17% on the CLUTRR dataset.
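The central idea of the abstract, a text predicate represented as a weighted mixture of knowledge base paths, with the learned weights providing the relation-linking mapping, can be illustrated with a minimal sketch. The predicate, the Wikidata-style path labels, and the weights below are hypothetical examples, not values from the paper:

```python
# Hypothetical sketch: a rule maps a natural-language predicate to a
# weighted mixture of KB paths; relation linking selects the
# highest-weight path. All names and weights are illustrative.

rules = {
    # text predicate -> {KB path: learned weight}
    "born in": {
        "wdt:P19 (place of birth)": 0.82,
        "wdt:P569 (date of birth)": 0.11,
        "wdt:P27 (country of citizenship)": 0.07,
    },
}

def link_relation(text_predicate: str) -> str:
    """Return the KB path with the highest learned weight for a predicate."""
    paths = rules[text_predicate]
    return max(paths, key=paths.get)

print(link_relation("born in"))  # -> "wdt:P19 (place of birth)"
```

In the paper's setting these weights are learned via masked training over logical forms rather than hand-assigned, but the resulting rule remains directly inspectable, which is the interpretability claim made above.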