Improving Abstractive Summarization via Dilated Convolution

Journal of Physics: Conference Series (2020)

Abstract. In this paper, a sequence-to-sequence hybrid neural network model is proposed for abstractive summarization. Our method uses a Bi-directional Long Short-Term Memory (Bi-LSTM) network and multi-level dilated convolutions (MDC) to capture global semantic information and semantic-unit-level information, respectively. In the decoding phase, the model generates words conditioned on summary-relevant information captured by an attention mechanism. Experiments show that the proposed model outperforms several strong baselines on both the Gigaword corpus and the DUC-2004 task.
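The abstract does not give implementation details of the MDC block, but the core idea of dilated convolution is that a kernel of size k applied with dilation rate d skips d−1 positions between taps, widening the receptive field to (k−1)·d+1 without extra parameters; stacking several dilation rates captures context at multiple scales. The NumPy sketch below illustrates this on a 1-D sequence; the function names and the choice of dilation rates (1, 2, 4) are illustrative assumptions, not the paper's actual architecture.

```python
import numpy as np

def dilated_conv1d(x, w, dilation):
    """'Same'-padded 1-D convolution of sequence x with kernel w at the given
    dilation rate: taps are spaced `dilation` positions apart."""
    k = len(w)
    pad = (k - 1) * dilation // 2          # keeps output length == input length (odd k)
    xp = np.pad(x, pad)
    return np.array([sum(w[i] * xp[t + i * dilation] for i in range(k))
                     for t in range(len(x))])

def multi_level_dilated(x, kernels, dilations):
    """Stack feature maps from several dilation rates -- a stand-in for a
    multi-level dilated convolution (MDC) block over encoder states."""
    return np.stack([dilated_conv1d(x, w, d) for w, d in zip(kernels, dilations)])

# Example: three levels with dilation rates 1, 2, 4 over a toy sequence.
x = np.arange(10.0)
kernels = [np.array([1.0, 1.0, 1.0]) / 3] * 3   # simple averaging kernels
features = multi_level_dilated(x, kernels, [1, 2, 4])
print(features.shape)                            # one feature row per dilation level
```

With kernel size 3, the three levels see receptive fields of 3, 5, and 9 positions, so later levels aggregate progressively wider "semantic-unit level" context while the sequence length is preserved.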