Reading More Efficiently: Multi-Sentence Summarization With A Dual Attention And Copy-Generator Network

PRICAI 2018: Trends in Artificial Intelligence, Pt I (2018)

Abstract
Sequence-to-sequence neural networks with attention have been widely used for text summarization as the amount of textual data has exploded in recent years. Traditional approaches to automatic summarization rely only on word-level attention, and most of them focus on generating a single-sentence summary. In this work, we propose a novel model with dual attention that considers both sentence- and word-level information and then generates a multi-sentence summary word by word. Additionally, we enhance our model with a copy-generator network to address the out-of-vocabulary (OOV) problem. The model shows significant performance gains on the CNN/DailyMail corpus compared with the baseline model. Experimental results demonstrate that our method obtains ROUGE-1, ROUGE-2, and ROUGE-L scores of 37.48, 16.40, and 34.36, respectively. Our work shows that several features of the proposed model contribute to further improvements in performance.
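The copy-generator mechanism mentioned in the abstract mixes the decoder's vocabulary distribution with a copy distribution derived from attention over source tokens, which lets the model emit out-of-vocabulary words. A minimal sketch of that mixture, assuming the standard pointer-generator formulation (all names, shapes, and values here are illustrative, not the authors' implementation):

```python
import numpy as np

def copy_generator_distribution(p_gen, vocab_dist, attention, src_ids, vocab_size):
    """Combine the generator's vocabulary distribution with a copy
    distribution over source tokens (pointer-generator style sketch).

    p_gen      : scalar in [0, 1], probability of generating from the vocab
    vocab_dist : (vocab_size,) softmax over the fixed vocabulary
    attention  : (src_len,) attention weights over source positions
    src_ids    : (src_len,) extended-vocabulary ids of source tokens;
                 OOV source tokens receive ids >= vocab_size
    """
    extended_size = max(vocab_size, max(src_ids) + 1)
    final = np.zeros(extended_size)
    # Generation path: scaled vocabulary distribution.
    final[:vocab_size] = p_gen * vocab_dist
    # Copy path: route the remaining probability mass to the ids of the
    # source tokens, so OOV words can receive nonzero probability.
    for pos, tok_id in enumerate(src_ids):
        final[tok_id] += (1.0 - p_gen) * attention[pos]
    return final

# Toy example: vocabulary of 5 words; the source holds one OOV token (id 5).
dist = copy_generator_distribution(
    p_gen=0.7,
    vocab_dist=np.array([0.2, 0.2, 0.2, 0.2, 0.2]),
    attention=np.array([0.5, 0.5]),
    src_ids=[1, 5],
    vocab_size=5,
)
```

Because the two paths are weighted by `p_gen` and `1 - p_gen`, the combined vector remains a valid probability distribution over the extended vocabulary, and the OOV token at id 5 ends up with nonzero probability even though it lies outside the fixed vocabulary.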
Keywords
Text summarization, Dual attention, Copy-generator network