Correlation Encoder-Decoder Model for Text Generation

2022 International Joint Conference on Neural Networks (IJCNN), 2022

Abstract
Text generation is crucial for many applications in natural language processing. With the prevalence of deep learning, the encoder-decoder architecture is the dominant choice for this task. Accurately encoding the source information is of key importance to text generation, because the target text can be generated only when accurate and complete source information is captured by the encoder and fed into the decoder. However, most existing approaches fail to effectively encode and learn the entire source information, as some features are easily lost during the encoder's layer-by-layer encoding procedure. Similar problems also affect the decoder. Reducing information loss in the encoder-decoder model is therefore critical for text generation. To address this issue, we propose a novel correlation encoder-decoder model, which optimizes both the encoder and the decoder to reduce information loss by encouraging them to minimize the differences between hierarchical layers through maximizing mutual information. Experimental results on two benchmark datasets demonstrate that the proposed model substantially outperforms existing state-of-the-art methods. Our source code is publicly available on GitHub.
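The training signal described in the abstract, maximizing mutual information between hierarchical layers so that features are not lost as representations pass through the network, is commonly realized with a contrastive lower bound such as InfoNCE. Below is a minimal PyTorch sketch of that idea, assuming pooled per-sentence hidden states from adjacent layers; the function name `info_nce_loss`, the temperature, and the loss weighting are illustrative assumptions, not the paper's actual implementation.

```python
# Hypothetical sketch: encourage adjacent layers to preserve information by
# maximizing an InfoNCE lower bound on the mutual information between their
# pooled hidden states. Not the paper's actual code.
import torch
import torch.nn.functional as F


def info_nce_loss(lower: torch.Tensor, upper: torch.Tensor,
                  temperature: float = 0.1) -> torch.Tensor:
    """InfoNCE bound on MI between two layers' pooled representations.

    lower, upper: (batch, hidden) pooled hidden states from adjacent layers.
    Matching rows (same source sentence) are positives; all other rows in
    the batch serve as in-batch negatives.
    """
    lower = F.normalize(lower, dim=-1)
    upper = F.normalize(upper, dim=-1)
    logits = lower @ upper.t() / temperature              # (batch, batch) similarities
    targets = torch.arange(lower.size(0), device=lower.device)
    # Minimizing this cross-entropy maximizes the InfoNCE MI lower bound.
    return F.cross_entropy(logits, targets)


# Usage: add the MI term to the usual generation loss on each training step.
batch, hidden = 8, 256
h_layer_k = torch.randn(batch, hidden)                    # pooled states, layer k
h_layer_k1 = torch.randn(batch, hidden)                   # pooled states, layer k+1
mi_loss = info_nce_loss(h_layer_k, h_layer_k1)
# total_loss = generation_loss + lambda_mi * mi_loss      # lambda_mi: hypothetical weight
```

In this sketch the same loss could be applied between any pair of encoder or decoder layers; how many layer pairs to constrain and how to weight the MI term against the generation loss are design choices the abstract does not specify.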
Keywords
text, generation, correlation, encoder-decoder