Importance Guided Query Focused Long-Input Summarization

2023 IEEE 9th International Conference on Cloud Computing and Intelligent Systems (CCIS), 2023

Abstract
Query-focused long-document summarization aims to generate a summary conditioned on both a given query and a given long text. Previous end-to-end methods feed the query and the long text into a single model and generate the summary directly. However, it is difficult for an end-to-end model to recognize which parts of the long text matter most for answering the query. Moreover, the query is much shorter than the document, so its information is diluted after being concatenated with the long text; as a result, many summary sentences are irrelevant to the query. In this paper, we propose IGQFS (Importance Guided Query Focused long-input Summarization), which predicts an importance score for each sentence as an auxiliary task while simultaneously generating the summary. To this end, we design an improved cross-attention module for the decoder that is guided by the predicted utterance importance scores. Experimental results on the QMSum dataset show that the auxiliary task and the improved cross-attention module produce summaries more relevant to the queries, and that our method outperforms the end-to-end baseline model.
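The abstract does not spell out how the predicted utterance importance scores enter the decoder's cross-attention. Below is a minimal sketch of one plausible formulation, assuming the scores bias the attention logits of the tokens belonging to each utterance; the function name, shapes, and log-bias choice are illustrative assumptions, not the paper's exact design.

```python
import torch
import torch.nn.functional as F

def importance_guided_cross_attention(query, key, value, utt_importance, token_to_utt):
    """
    query:          (B, T_dec, d)  decoder hidden states
    key, value:     (B, T_enc, d)  encoder token representations
    utt_importance: (B, U)         predicted importance score per utterance, in (0, 1)
    token_to_utt:   (B, T_enc)     index of the utterance each encoder token belongs to
    """
    d = query.size(-1)
    # Standard scaled dot-product cross-attention logits.
    logits = torch.matmul(query, key.transpose(-1, -2)) / d ** 0.5   # (B, T_dec, T_enc)

    # Broadcast each utterance's importance score to all of its tokens, then
    # bias the logits with the log-importance (hypothetical formulation; the
    # paper only states that the scores guide the decoder's cross-attention).
    token_importance = torch.gather(utt_importance, 1, token_to_utt)  # (B, T_enc)
    logits = logits + torch.log(token_importance + 1e-9).unsqueeze(1)

    attn = F.softmax(logits, dim=-1)
    return torch.matmul(attn, value)                                  # (B, T_dec, d)


# Usage sketch with toy shapes.
B, T_dec, T_enc, U, d = 2, 4, 10, 3, 16
q = torch.randn(B, T_dec, d)
k = torch.randn(B, T_enc, d)
v = torch.randn(B, T_enc, d)
scores = torch.sigmoid(torch.randn(B, U))          # outputs of the auxiliary importance head
mapping = torch.randint(0, U, (B, T_enc))           # utterance index for each encoder token
out = importance_guided_cross_attention(q, k, v, scores, mapping)
print(out.shape)  # torch.Size([2, 4, 16])
```

Under this reading, tokens from low-importance utterances receive a strongly negative bias and are largely ignored by the decoder, which is one way the generated summary could stay focused on query-relevant sentences.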
Keywords
Query focused summarization, Utterance importance scores prediction, Utterance importance scores guided cross attention