Grounding proposition stores for question answering over linked data

Knowledge-Based Systems (2017)

Highlights

Grounding natural language is required for question answering, but its contribution remains unmeasured. We provide a standalone evaluation and propose a method to ground propositions into a knowledge base. Results show that grounding accounts for 78.6% of system performance. Simple lexical expansion can improve the results from 0.8% up to 2.5%.

Abstract

Grounding natural language utterances into semantic representations is crucial for tasks such as question answering and knowledge base population. However, the importance of the lexicons that are central to this mapping remains unmeasured, because question answering systems are evaluated as end-to-end systems. This article proposes a methodology to enable a standalone evaluation of grounding natural language propositions into semantic relations by fixing all the components of a question answering system other than the lexicon itself. This allows us to explore different configurations and determine which ones contribute most to overall system performance. Our experiments show that grounding accounts for close to 80% of system performance without training, whereas training yields a relative improvement of 7.6%. Finally, we show how lexical expansion using external linguistic resources can consistently improve the results from 0.8% up to 2.5%.
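The abstract does not include implementation details. As an illustrative sketch only, the following Python snippet shows the general idea behind lexicon-based grounding of a relational phrase to a knowledge-base relation, and how a lexical-expansion step can increase coverage. The lexicon entries, the dbo: relations, the synonym table (standing in for an external linguistic resource), and the ground function are all invented for this example and are not taken from the paper.

```python
# Hypothetical illustration of lexicon-based grounding of propositions
# to knowledge-base relations, with a simple lexical-expansion step.
# All lexicon entries, relations, and synonyms below are invented examples.

from typing import Optional

# Lexicon: maps relational phrases (as they appear in propositions) to KB relations.
LEXICON = {
    "was born in": "dbo:birthPlace",
    "is married to": "dbo:spouse",
    "wrote": "dbo:author",
}

# Stand-in for an external linguistic resource (e.g. a synonym dictionary).
SYNONYMS = {
    "authored": "wrote",
    "penned": "wrote",
    "wed": "is married to",
}

def ground(phrase: str, use_expansion: bool = False) -> Optional[str]:
    """Map a relational phrase to a KB relation, optionally using lexical expansion."""
    phrase = phrase.lower().strip()
    if phrase in LEXICON:
        return LEXICON[phrase]
    if use_expansion and phrase in SYNONYMS:
        return LEXICON.get(SYNONYMS[phrase])
    return None  # ungrounded proposition

if __name__ == "__main__":
    # Without expansion, "authored" is not covered by the base lexicon.
    print(ground("authored"))                      # None
    # With lexical expansion, the phrase is grounded to dbo:author.
    print(ground("authored", use_expansion=True))  # dbo:author
```

The sketch only illustrates why expanding the lexicon with external linguistic resources can recover otherwise ungrounded propositions, which is the effect the paper quantifies.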
Keywords
Question answering, Semantic parsing, Linked data, Grounding