QDG: A unified model for automatic question-distractor pairs generation

Applied Intelligence (2022)

Abstract
Generating high-quality, complete question sets (i.e., the question, the answer, and the distractors) for reading comprehension tasks is challenging and rewarding. This paper proposes a question-distractor joint generation framework (QDG) that automatically generates both questions and distractors given a background text and a specified answer. Our work makes it possible to assemble complete multiple-choice reading comprehension questions that can be applied more readily in educators' work. While question generation and distractor generation have each been studied independently, there has been little work on joint question-distractor generation. In previous joint approaches, distractors could only be constructed by first generating questions and then ranking candidate answers by word similarity; it was not possible to generate question-distractor pairs in an end-to-end, unified manner. To the best of our knowledge, we are the first to propose an end-to-end question-distractor joint generation framework on the RACE dataset. We observe that distractors are partially relevant to the background article; by suppressing those related parts, the generated questions can focus better on the parts relevant to the correct answer. Experimental results show that the model achieves a major advance on the question-distractor pair generation task, and the question generation component outperforms the baselines. For further evaluation, we also conducted a human evaluation to demonstrate the educational value of our model in generating high-quality question-distractor pairs.
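The abstract describes suppressing the distractor-related parts of the passage so that question generation attends to the answer-relevant parts. The paper does not give implementation details here, so the following is only a minimal sketch of one plausible realization: an additive attention mask that down-weights passage tokens flagged as distractor-related before the softmax. The function name, tensor shapes, and penalty value are illustrative assumptions, not the authors' implementation.

```python
import torch
import torch.nn.functional as F

def suppress_related_attention(scores: torch.Tensor,
                               related_mask: torch.Tensor,
                               penalty: float = 1e4) -> torch.Tensor:
    """Down-weight attention over passage tokens marked as distractor-related,
    so the decoder focuses on the answer-relevant parts of the passage.

    scores:       (batch, tgt_len, src_len) raw attention logits
    related_mask: (batch, src_len), 1.0 for tokens to suppress, 0.0 otherwise
    """
    # Subtracting a large constant from suppressed positions before the
    # softmax pushes their attention weights toward zero.
    masked_scores = scores - penalty * related_mask.unsqueeze(1)
    return F.softmax(masked_scores, dim=-1)
```

In such a setup, `related_mask` would be derived from the spans the model identifies as relevant to the distractors, while the unmasked positions remain available for question generation conditioned on the given answer.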
Keywords
Natural language processing, Question generation, Distractor generation, Attention mechanism