Distant context aware text generation from abstract meaning representation

Applied Intelligence (2021)

Abstract
Text generation from abstract meaning representation is a fundamental task in natural language generation. An interesting challenge is that distant context can influence the surface realization of each node. In previous encoder-decoder approaches, graph neural networks have commonly been used to encode abstract meaning representation graphs and have outperformed sequence and tree encoders. However, most of them cannot stack many layers and are therefore too shallow to capture distant context. In this paper, we propose solutions from three aspects. First, we introduce a Transformer-based graph encoder to embed abstract meaning representation graphs; this encoder can stack more layers to encode larger context without performance degradation. Second, we expand the receptive field of each node, i.e., we build direct connections between node pairs so that each node captures information from its distant neighbors. We also exploit relative position embeddings to make the model aware of the original hierarchy of the graphs. Third, we encode the linearized abstract meaning representation with a pre-trained language model to obtain a sequence encoding, and incorporate it into the graph encoding to enrich the features. We conduct experiments on LDC2015E86 and LDC2017T10. Experimental results demonstrate that our method outperforms previous strong baselines; in particular, the performance gain is even larger on large graphs. Our best model achieves a BLEU of 31.99 and a METEOR of 37.02 on LDC2015E86, and a BLEU of 34.21 and a METEOR of 39.26 on LDC2017T10, establishing a new state of the art.
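The abstract outlines but does not detail the second idea (a full receptive field plus relative position embeddings over graph distances). The following PyTorch sketch is one plausible reading, assuming a single-head self-attention layer in which every node attends to every other node and each query-key score is biased by a learned embedding of their shortest-path distance, clipped at max_dist; the class name GraphDistanceAttention, the scalar-bias form, and the clipping threshold are illustrative assumptions, not the paper's published architecture.

import torch
import torch.nn as nn

class GraphDistanceAttention(nn.Module):
    # Hypothetical sketch: self-attention over AMR graph nodes where each
    # query-key pair is biased by an embedding of its graph distance, so every
    # node sees all others (expanded receptive field) while the distance bias
    # keeps the model aware of the original graph hierarchy.
    def __init__(self, dim, max_dist=8):
        super().__init__()
        self.q = nn.Linear(dim, dim)
        self.k = nn.Linear(dim, dim)
        self.v = nn.Linear(dim, dim)
        # one learned scalar bias per clipped shortest-path distance
        self.dist_bias = nn.Embedding(max_dist + 1, 1)
        self.max_dist = max_dist
        self.scale = dim ** -0.5

    def forward(self, x, dist):
        # x: (n_nodes, dim) node features; dist: (n_nodes, n_nodes) integer
        # shortest-path distances between node pairs
        q, k, v = self.q(x), self.k(x), self.v(x)
        scores = (q @ k.t()) * self.scale
        scores = scores + self.dist_bias(dist.clamp(max=self.max_dist)).squeeze(-1)
        return torch.softmax(scores, dim=-1) @ v

# Toy usage: 5 AMR nodes with 64-dim features and random pairwise distances.
x = torch.randn(5, 64)
dist = torch.randint(0, 4, (5, 5))
out = GraphDistanceAttention(64)(x, dist)   # -> (5, 64)

Stacking several such layers lets distant neighbors interact directly at every depth, which is the property the abstract credits for the gains on large graphs.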
Keywords
Text generation, Abstract meaning representation, Graph encoder, Receptive field