Rel-grams: a probabilistic model of relations in text

AKBC-WEKEX@NAACL-HLT(2012)

Citations: 33
Abstract
We introduce the Rel-grams language model, which is analogous to an n-grams model, but is computed over relations rather than over words. The model encodes the conditional probability of observing a relational tuple R, given that R' was observed in a window of prior relational tuples. We build a database of Rel-grams co-occurrence statistics from ReVerb extractions over 1.8M newswire documents and show that a graphical model based on these statistics is useful for automatically discovering event templates. We make this database freely available and hope it will prove a useful resource for a wide variety of NLP tasks.
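The conditional probability of tuple R given a prior tuple R' can be sketched as a maximum-likelihood estimate over windowed co-occurrence counts. This is a hypothetical illustration only, assuming tuples are simple (arg1, relation, arg2) triples; the paper's actual system builds its counts from ReVerb extractions and layers a graphical model on top:

```python
from collections import Counter

def relgram_counts(doc_tuples, window=2):
    """Count, over a corpus, how often tuple R follows tuple R'
    within a window of prior tuples in the same document.
    doc_tuples: list of documents, each an ordered list of
    (arg1, relation, arg2) triples (hypothetical representation)."""
    pair_counts = Counter()   # counts of (R', R) within the window
    prior_counts = Counter()  # counts of R' appearing as a prior tuple
    for tuples in doc_tuples:
        for i, r in enumerate(tuples):
            for j in range(max(0, i - window), i):
                r_prior = tuples[j]
                pair_counts[(r_prior, r)] += 1
                prior_counts[r_prior] += 1
    return pair_counts, prior_counts

def relgram_prob(r, r_prior, pair_counts, prior_counts):
    """Maximum-likelihood estimate of P(R | R') from the counts."""
    if prior_counts[r_prior] == 0:
        return 0.0
    return pair_counts[(r_prior, r)] / prior_counts[r_prior]
```

For example, with a single document containing the ordered tuples ("he", "ordered", "pizza"), ("he", "ate", "pizza"), ("he", "paid", "bill") and a window of 2, "ate" follows "ordered" in one of the two windows where "ordered" is a prior, so the estimate of P(ate | ordered) is 0.5.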
Keywords
ReVerb extraction, probabilistic model, conditional probability, relational tuple R, prior relational tuples, NLP task, Rel-grams co-occurrence statistics, Rel-grams language model, useful resource, n-grams model, graphical model