Neural Abstractive Summarization: A Brief Survey

2023 IEEE 3rd International Conference on Computer Communication and Artificial Intelligence (CCAI), 2023

Abstract
Owing to advances in neural networks, abstractive summarization has received more attention than extractive summarization and has made significant progress in generating fluent, human-like summaries with novel expressions. Seq2seq has become the primary framework for abstractive summarization, employing encoder-decoder architectures built on RNNs, CNNs, or Transformers. In this paper, we review the neural models for abstractive summarization that are based on the seq2seq framework. We also discuss some of the most effective techniques for improving seq2seq models and highlight two challenging directions for in-depth investigation: generating query-based abstractive summaries and incorporating commonsense knowledge.
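To make the encoder-decoder framing concrete, below is a minimal, hypothetical sketch of a seq2seq summarizer of the kind the survey describes: an encoder reads the source document into a context state, and a decoder generates the summary token by token, trained with teacher forcing. It assumes PyTorch; all class names, vocabulary and layer sizes are illustrative assumptions, not models from the paper.

```python
# Minimal seq2seq sketch (illustrative assumption, not the survey's model):
# a GRU-based encoder-decoder treating summarization as conditional generation.
import torch
import torch.nn as nn

class Seq2SeqSummarizer(nn.Module):
    def __init__(self, vocab_size=8000, emb_dim=128, hidden_dim=256):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)
        # Encoder compresses the source document into a context state.
        self.encoder = nn.GRU(emb_dim, hidden_dim, batch_first=True)
        # Decoder generates the summary conditioned on that state.
        self.decoder = nn.GRU(emb_dim, hidden_dim, batch_first=True)
        self.out = nn.Linear(hidden_dim, vocab_size)

    def forward(self, src_ids, tgt_ids):
        _, ctx = self.encoder(self.embed(src_ids))            # ctx: (1, B, H)
        dec_out, _ = self.decoder(self.embed(tgt_ids), ctx)   # teacher forcing
        return self.out(dec_out)                              # (B, T, V) logits

# Smoke test with random token ids: 2 documents of 50 tokens,
# paired with 12-token target summaries.
model = Seq2SeqSummarizer()
src = torch.randint(0, 8000, (2, 50))
tgt = torch.randint(0, 8000, (2, 12))
logits = model(src, tgt)
loss = nn.functional.cross_entropy(logits.reshape(-1, 8000), tgt.reshape(-1))
print(logits.shape, loss.item())
```

Transformer-based and pre-trained variants replace the GRUs with self-attention stacks, but the encode-then-decode structure is the same.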
Keywords
neural network, abstractive summarization, seq2seq, Transformer, pre-trained models