Contrastive Graph Representations for Logical Formulas Embedding

IEEE Transactions on Knowledge and Data Engineering (2021)

Cited by 4 | Views 4
Abstract
Currently, the opaque computing process of deep learning has become a significant obstacle to its further development. The Neural-Symbolic (NS) system, formed by integrating logic rules into neural networks, has attracted increasing attention owing to its direct interpretability. Embedding symbolic logical formulas into a low-dimensional continuous space provides an effective foundation for NS systems. However, current studies are limited in their ability to model the syntactic structure of formulas and fail to preserve their intrinsic semantics in the embeddings, which causes poor performance on downstream reasoning tasks. To this end, this paper proposes a novel method of Contrastive Graph Representations (ConGR) for embedding logical formulas. First, to improve the modeling of syntactic structure, ConGR introduces a densely connected graph convolutional network (GCN) with an attention mechanism to process syntax parsing graphs of formulas. In this way, discriminative local and global embeddings of formulas are obtained at the syntax level. Second, contrastive instances (positive or negative) for each anchor formula are generated by transformations guided by logical properties. To preserve semantic information, two types of contrast, global-local and global-global, are carried out to refine the formula embeddings. Extensive experiments demonstrate that ConGR outperforms state-of-the-art baselines on entailment-checking and premise-selection datasets.
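The abstract does not specify the contrastive objective used for the global-local and global-global contrasts. A common choice for this kind of anchor/positive/negative setup is an InfoNCE-style loss over cosine similarities; the sketch below is an illustration under that assumption, with hypothetical toy embeddings rather than real formula representations.

```python
import math

def cosine(u, v):
    # Cosine similarity between two dense embedding vectors.
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return dot / (nu * nv)

def info_nce(anchor, positive, negatives, tau=0.5):
    """InfoNCE-style contrastive loss (assumed, not from the paper):
    pulls the anchor embedding toward its positive instance and
    pushes it away from the negative instances."""
    pos = math.exp(cosine(anchor, positive) / tau)
    neg = sum(math.exp(cosine(anchor, n) / tau) for n in negatives)
    return -math.log(pos / (pos + neg))

# Toy vectors: the positive is a slight perturbation of the anchor
# (standing in for a logically equivalent transformed formula), the
# negative points the opposite way (a semantically different formula).
anchor = [1.0, 0.0]
positive = [0.9, 0.1]
negatives = [[-1.0, 0.0]]
loss = info_nce(anchor, positive, negatives)
```

A well-aligned positive yields a small loss, while swapping the roles of the positive and negative makes the loss large, which is the gradient signal that refines the formula embeddings.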
Keywords
Formulas embedding, graph representation, contrastive learning, graph convolutional network