Monotonic alignments for summarization

Knowledge-Based Systems (2020)

Abstract
Summarization is the task of creating a summary that captures the major points of the original document. Deep learning plays an important role in both abstractive and extractive summary generation. While a number of models show that combining the two gives good results, this paper focuses on a purely abstractive method to generate good summaries. Our model is a stacked RNN network with a monotonic alignment mechanism. Monotonic alignment has the advantage of producing context in the same order as the original document while eliminating repeated sequences. To obtain monotonic alignment, this paper proposes two energies that are calculated using only the previous alignment state. We use a sub-word method to reduce the rate of producing OOVs (out-of-vocabulary words). Dropout is used for generalization, and residual connections are used to overcome vanishing gradients. We experiment on the CNN/Daily Mail and Reddit datasets. Our method outperforms previous models with monotonic alignment by 4 ROUGE-1 points and achieves results comparable to the state of the art.
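The abstract's core idea, that alignment may only stay in place or move forward over the source so the context follows document order and repetition is ruled out, can be illustrated with a minimal greedy sketch. This is a hypothetical illustration, not the paper's actual energy formulation: the function names, the NumPy score matrix, and the greedy argmax rule are all assumptions for clarity.

```python
import numpy as np

def monotonic_align(scores, prev_pos):
    """Pick the best-scoring source position at or after prev_pos.

    Hypothetical sketch: positions before the previous alignment are
    masked out, which enforces the monotonicity described in the abstract.
    """
    masked = scores.copy()
    masked[:prev_pos] = -np.inf  # forbid attending to earlier positions
    return int(np.argmax(masked))

def decode_alignments(score_matrix):
    """Greedy monotonic alignment path over all decoder steps."""
    pos = 0
    path = []
    for scores in score_matrix:  # one row of attention scores per decoder step
        pos = monotonic_align(scores, pos)
        path.append(pos)
    return path
```

Because each step can only look forward, the resulting alignment path is non-decreasing, which is exactly the property that prevents the decoder from re-attending to (and re-generating) earlier source content.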
Keywords
Summarization, Monotonic, Alignment, Attention