Hierarchical information matters! Improving AMR parsing with multi-granularity representation interactions

Information Processing & Management (2024)

Abstract
Abstract Meaning Representation (AMR) parsing aims to automatically translate text into a directed acyclic semantic graph, a task that has recently been improved significantly by Transformer-based pre-trained models (e.g., BART and T5). However, existing parsers still make errors, particularly in graph structure prediction and concept abstraction, that substantially distort the meaning of the text. In this paper, we attempt to alleviate these issues in AMR parsing by incorporating the hierarchical structure of the input text into pre-trained models. We first present a hierarchical multi-granularity (HMG) schema to describe the hierarchical structure of the given text as a sentence-clause-phrase-word hierarchy. We then propose a hierarchical multi-granularity AMR parsing framework (HMG-AMR) based on a Transformer, which explicitly integrates the HMG schema of the input text. Specifically, we introduce new inductive biases and enhance the interactions among multi-granularity representations (i.e., words, phrases, clauses and sentences) by modifying the attention mechanisms in the encoder and the decoder. We conduct extensive experiments on two public in-distribution benchmarks (AMR2.0 and AMR3.0), on three out-of-distribution benchmarks (Bio, New3, and TLP), and in several few-shot settings. The results show that HMG-AMR outperforms a strong baseline by up to 1.4% and 1.1% in Smatch score on AMR2.0 and AMR3.0, respectively. In addition, HMG-AMR exhibits notable advantages in out-of-distribution and few-shot settings, showing its ability to compensate for insufficient data and to adapt to diverse domains. Most notably, further analyses involving 10 linguistic probing tasks verify that incorporating the HMG schema allows the model to capture distinct linguistic properties, demonstrating the universality of the proposed framework and its potential for application in other tasks.
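The abstract does not spell out how the attention mechanisms are modified, but one common way to inject a sentence-clause-phrase-word hierarchy into a Transformer is to add a granularity-aware bias to the attention logits so that tokens sharing a phrase or clause interact more strongly. The sketch below is a hypothetical illustration of that general idea, not the authors' HMG-AMR implementation; the segment ids, bonus values, and function names are invented for demonstration.

```python
# Hypothetical sketch: hierarchy-biased self-attention (not the authors' code).
import numpy as np

def hierarchical_attention_bias(phrase_ids, clause_ids,
                                phrase_bonus=1.0, clause_bonus=0.5):
    """Build an additive attention bias from per-token segment ids.

    phrase_ids, clause_ids: 1-D int arrays with one id per token.
    Token pairs in the same phrase receive `phrase_bonus` on their
    attention logits; pairs in the same clause (but different phrases)
    receive `clause_bonus`; all other pairs get 0 (sentence level).
    """
    same_phrase = phrase_ids[:, None] == phrase_ids[None, :]
    same_clause = clause_ids[:, None] == clause_ids[None, :]
    return np.where(same_phrase, phrase_bonus,
                    np.where(same_clause, clause_bonus, 0.0))

def attention(q, k, v, bias):
    """Scaled dot-product attention with an additive bias on the logits."""
    d = q.shape[-1]
    logits = q @ k.T / np.sqrt(d) + bias
    weights = np.exp(logits - logits.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ v

# Toy example: 6 tokens grouped into 3 phrases and 2 clauses (ids are made up).
phrase_ids = np.array([0, 0, 1, 1, 2, 2])
clause_ids = np.array([0, 0, 0, 0, 1, 1])
rng = np.random.default_rng(0)
q = k = v = rng.normal(size=(6, 8))
out = attention(q, k, v, hierarchical_attention_bias(phrase_ids, clause_ids))
print(out.shape)  # (6, 8)
```

In practice such a bias (or a mask restricting attention within each granularity level) would be applied per head inside the pre-trained encoder and decoder; the paper's actual mechanism may differ.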
Keywords
Abstract meaning representation parsing, Hierarchical multi-granularity schema, Transformer-based pre-trained models, Multi-granularity representation interactions