Transformer-based Natural Language Understanding and Generation

Feng Zhang, Gaoyun An, Qiuqi Ruan

2022 16th IEEE International Conference on Signal Processing (ICSP)

Abstract
Facilitating the sharing of information between the two complementary tasks of Natural Language Understanding (NLU) and Natural Language Generation (NLG) is crucial to the study of Natural Language Processing (NLP). NLU extracts the core semantics from a given utterance, while NLG, in contrast, aims to construct the corresponding sentence based on given semantics. However, model training for both tasks relies on manually annotated data, and the complexity of the annotation process makes such data costly to acquire at scale. Moreover, in existing research, few scholars have treated NLU and NLG as dual tasks. Indeed, both can be approached as translation problems: NLU translates natural language into formal representations, while NLG converts formal representations into natural language. In this paper, we propose a Transformer-based Natural Language Understanding and Generation (T-NLU&G) model that jointly models NLU and NLG by introducing a shared latent variable. The model helps us explore the intrinsic connection between the natural language space and the formal representation space, and uses this latent variable to facilitate information sharing between the two spaces. Experiments show that our model achieves performance gains on both the E2E and Weather datasets, validating the feasibility and effectiveness of the T-NLU&G model for the respective tasks, and that it is competitive with current state-of-the-art methods.
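To make the dual-task idea concrete, the following is a minimal, hypothetical PyTorch sketch (not the authors' implementation) of the pattern the abstract describes: an utterance encoder and a meaning-representation (MR) encoder are both projected into one shared latent space, from which the NLU direction predicts formal-representation tokens and the NLG direction predicts natural-language tokens. All class names, dimensions, and the pooling strategy are illustrative assumptions.

```python
# Hypothetical sketch of joint NLU/NLG modeling via a shared latent variable.
# Not the T-NLU&G architecture itself; dimensions and names are assumptions.
import torch
import torch.nn as nn

class SharedLatentDualModel(nn.Module):
    def __init__(self, vocab_size=1000, mr_size=200, d_model=128, latent_dim=64):
        super().__init__()
        self.text_emb = nn.Embedding(vocab_size, d_model)
        self.mr_emb = nn.Embedding(mr_size, d_model)
        # Separate Transformer encoders for the natural-language and formal spaces
        text_layer = nn.TransformerEncoderLayer(d_model, nhead=4, batch_first=True)
        mr_layer = nn.TransformerEncoderLayer(d_model, nhead=4, batch_first=True)
        self.text_encoder = nn.TransformerEncoder(text_layer, num_layers=2)
        self.mr_encoder = nn.TransformerEncoder(mr_layer, num_layers=2)
        # Shared latent variable: both spaces map into / out of the same latent space
        self.to_latent = nn.Linear(d_model, latent_dim)
        self.from_latent = nn.Linear(latent_dim, d_model)
        # NLU head predicts MR tokens; NLG head predicts natural-language tokens
        self.nlu_head = nn.Linear(d_model, mr_size)
        self.nlg_head = nn.Linear(d_model, vocab_size)

    def encode_text(self, text_ids):
        h = self.text_encoder(self.text_emb(text_ids))
        return self.to_latent(h.mean(dim=1))        # pooled shared latent z

    def encode_mr(self, mr_ids):
        h = self.mr_encoder(self.mr_emb(mr_ids))
        return self.to_latent(h.mean(dim=1))

    def nlu_logits(self, text_ids):
        z = self.encode_text(text_ids)              # utterance -> shared latent
        return self.nlu_head(self.from_latent(z))   # latent -> formal representation

    def nlg_logits(self, mr_ids):
        z = self.encode_mr(mr_ids)                  # MR -> shared latent
        return self.nlg_head(self.from_latent(z))   # latent -> natural language

# Toy usage: two utterances (length 5) and two meaning representations (length 3)
model = SharedLatentDualModel()
text = torch.randint(0, 1000, (2, 5))
mr = torch.randint(0, 200, (2, 3))
print(model.nlu_logits(text).shape)  # torch.Size([2, 200])
print(model.nlg_logits(mr).shape)    # torch.Size([2, 1000])
```

Because both directions pass through the same latent projection, gradients from the NLU and NLG objectives can jointly shape that space, which is one plausible way information sharing between the two tasks could be realized.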
Keywords
natural language understanding, natural language generation, transformer, dual relationship