Summary++: Summarizing Chinese News Articles with Attention

Lecture Notes in Artificial Intelligence (2018)

Abstract
We present Summary++, the model that competed in the NLPCC 2018 Summary task. In this paper, we describe the task, our model, the results, and other aspects of our experiments in detail. The task is news article summarization in Chinese, where one sentence is generated per article. We use a neural encoder-decoder attention model with a pointer-generator network, and modify it to focus on the words attended to rather than the words predicted. Our model achieved second place in the task with a score of 0.285. The highlights of our model are that it runs at the character level, uses no extra features (e.g. part of speech, dependency structure), and requires very little preprocessing.
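The abstract refers to a pointer-generator attention model. As a point of reference only, the sketch below shows how a standard pointer-generator mixes the vocabulary distribution with the copy (attention) distribution over source tokens, the mechanism this work builds on; all function and variable names are illustrative assumptions, not the authors' code, and the paper's modification of favoring attended-to characters over predicted ones is not reproduced here.

```python
import torch

def pointer_generator_dist(vocab_dist, attn_dist, p_gen, src_ids, extended_vsize):
    """Minimal sketch of standard pointer-generator mixing (assumed shapes/names).

    vocab_dist : (batch, vocab_size)   softmax over the fixed vocabulary
    attn_dist  : (batch, src_len)      attention weights over source tokens
    p_gen      : (batch, 1)            probability of generating vs. copying
    src_ids    : (batch, src_len)      source token ids in the extended vocabulary
    extended_vsize : int               fixed vocab size + per-article OOV tokens
    """
    batch = vocab_dist.size(0)
    # Weight generation and copying by p_gen and (1 - p_gen).
    weighted_vocab = p_gen * vocab_dist
    weighted_copy = (1.0 - p_gen) * attn_dist
    # Pad the generator distribution out to the extended vocabulary.
    extra = torch.zeros(batch, extended_vsize - vocab_dist.size(1))
    final_dist = torch.cat([weighted_vocab, extra], dim=1)
    # Scatter-add the copy probabilities onto their source token ids.
    final_dist = final_dist.scatter_add(1, src_ids, weighted_copy)
    return final_dist
```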
Keywords
Text summarization, Sequence-to-sequence, Pointer, Coverage