Recursive Random Fields

IJCAI 2007

Abstract
A formula in first-order logic can be viewed as a tree, with a logical connective at each node, and a knowledge base can be viewed as a tree whose root is a conjunction. Markov logic (Richardson and Domingos, 2006) makes this conjunction probabilistic, as well as the universal quantifiers directly under it, but the rest of the tree remains purely logical. This causes an asymmetry in the treatment of conjunctions and disjunctions, and of universal and existential quantifiers. We propose to overcome this by allowing the features of Markov logic networks (MLNs) to be nested MLNs. We call this representation recursive random fields (RRFs). RRFs can represent many first-order distributions exponentially more compactly than MLNs. We perform inference in RRFs using MCMC and ICM, and weight learning using a form of backpropagation. Weight learning in RRFs is more powerful than structure learning in MLNs. Applied to first-order knowledge bases, it provides a very flexible form of theory revision. We evaluate RRFs on the problem of probabilistic integrity constraints in databases, and obtain promising results.

For example, an MLN with the formula R(X) ∧ S(X) can treat worlds that violate both R(X) and S(X) as less probable than worlds that violate only one. Since an MLN acts as a soft conjunction, the groundings of R(X) and S(X) simply appear as distinct formulas. (MLNs convert the knowledge base to CNF before performing learning or inference.) This is not possible for the disjunction R(X) ∨ S(X): no distinction is made between satisfying both R(X) and S(X) and satisfying just one. Since a universally quantified formula is effectively a conjunction over all its groundings, while an existentially quantified formula is a disjunction over them, the two quantifiers end up being handled differently.
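The asymmetry described above can be made concrete with a small sketch. The functions below are hypothetical (not from the paper's implementation): each computes the unnormalized MLN weight exp(w · #satisfied clauses) of a two-atom world, once for the conjunction R ∧ S (which CNF conversion splits into two clauses) and once for the single disjunctive clause R ∨ S.

```python
import math

# Hypothetical ground MLN over two atoms R and S, weight w per clause.
w = 1.0

def weight_conjunction(r, s):
    # The KB R(X) ^ S(X) is converted to CNF: two clauses, R and S.
    # Each satisfied clause contributes a factor exp(w), so violating
    # both clauses costs more than violating just one.
    return math.exp(w * (int(r) + int(s)))

def weight_disjunction(r, s):
    # The single clause R v S is satisfied if either atom is true.
    # Satisfying both atoms earns no more weight than satisfying one.
    return math.exp(w * int(r or s))

# Conjunction: violating both (F,F) is less probable than violating one.
print(weight_conjunction(False, False) < weight_conjunction(True, False))   # True
# Disjunction: (T,T) and (T,F) receive identical weight -- the asymmetry.
print(weight_disjunction(True, True) == weight_disjunction(True, False))    # True
```

The conjunction degrades gracefully with each violated conjunct, while the disjunction is all-or-nothing; the same contrast carries over to universal versus existential quantifiers.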
This asymmetry can be avoided by "softening" disjunction and existential quantification in the same way that Markov logic softens conjunction and universal quantification. The result is a representation where MLNs can have nested MLNs as features. We call these recursive Markov logic networks, or recursive random fields (RRFs) for short.
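A minimal sketch of the nesting idea, under the assumption that an RRF feature has the same exp-of-weighted-sum form at every level of the tree (the `rrf_feature` helper and the world representation are illustrative, not the paper's API):

```python
import math

# A feature is either a ground atom (leaf) or an exp-weighted sum of
# child features. Because every internal node has the same soft form,
# conjunction- and disjunction-like combinations are softened alike.
def rrf_feature(weights, children):
    # children: list of functions mapping a world to a real value.
    def f(world):
        return math.exp(sum(w * c(world) for w, c in zip(weights, children)))
    return f

# Leaves: ground atoms read from the world (a dict of truth values).
R = lambda world: float(world["R"])
S = lambda world: float(world["S"])

# A soft "R and S": largest when both atoms hold, smoothly smaller otherwise.
soft_and = rrf_feature([1.0, 1.0], [R, S])

# Nesting: an outer feature whose child is itself an RRF feature,
# i.e. an MLN whose feature is a nested MLN.
outer = rrf_feature([0.5], [soft_and])

print(soft_and({"R": 1, "S": 1}) > soft_and({"R": 1, "S": 0}))  # True
```

With negative or fractional weights the same form can approximate disjunctions and other connectives, which is what lets weight learning over a fixed nesting act as a flexible form of theory revision.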
Keywords
logical connective, recursive random field, knowledge base, flexible form, markov logic network, existential quantifiers, first-order distribution, conjunction probabilistic, first-order logic, markov logic, nested mlns, integrity constraints, first order, random field, satisfiability, first order logic