ELMo+Gated Self-attention Network Based on BiDAF for Machine Reading Comprehension
2020 IEEE 11th International Conference on Software Engineering and Service Science (ICSESS), 2020
Abstract
Machine reading comprehension (MRC) has long been a significant part of artificial intelligence and a focus of natural language processing (NLP). Given a context paragraph and a query about it, a model must encode the complex interaction between the question and the context. In recent years, with the rapid progress of neural network models and attention mechanisms, MRC has made great advances; attention mechanisms in particular have been widely used in MRC. However, the accuracy of earlier classic baseline models still leaves room for improvement, and some of them do not account for long-range context dependence or polysemy. In this paper, to resolve these problems and further improve the model, we introduce ELMo representations and add a gated self-attention layer to the Bi-Directional Attention Flow network (BiDAF). In addition, we employ a feature-reuse method and modify the linear function of the answer layer to further improve performance. In experiments on SQuAD, this model greatly exceeds the baseline BiDAF model, and its performance approaches the average level of human performance, which demonstrates the validity of the model.
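The gated self-attention layer mentioned above can be sketched as follows. This is a minimal, simplified variant for illustration only (plain scaled dot-product self-attention followed by a highway-style sigmoid gate), not the paper's exact formulation; the gate weights `Wg` and bias `bg` are hypothetical parameters introduced for the sketch.

```python
import math

def softmax(xs):
    # Numerically stable softmax over a list of scores.
    m = max(xs)
    es = [math.exp(x - m) for x in xs]
    s = sum(es)
    return [e / s for e in es]

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def gated_self_attention(H, Wg, bg):
    """Minimal gated self-attention over a sequence of d-dim vectors H.

    For each position t:
      1. score every position j with a scaled dot product H[t]Â·H[j];
      2. form a context vector c_t as the softmax-weighted sum of H;
      3. gate: g_t = sigmoid(Wg . [H[t]; c_t] + bg), and output the
         highway-style mix g_t * c_t + (1 - g_t) * H[t].
    """
    d = len(H[0])
    out = []
    for ht in H:
        # 1. self-attention scores of position t against all positions
        scores = [sum(ht[k] * hj[k] for k in range(d)) / math.sqrt(d) for hj in H]
        attn = softmax(scores)
        # 2. attended context vector for position t
        ct = [sum(attn[j] * H[j][k] for j in range(len(H))) for k in range(d)]
        # 3. scalar gate computed from the concatenation [H[t]; c_t]
        concat = ht + ct
        g = sigmoid(sum(Wg[k] * concat[k] for k in range(2 * d)) + bg)
        out.append([g * ct[k] + (1.0 - g) * ht[k] for k in range(d)])
    return out
```

With a strongly negative bias the gate closes (g ≈ 0) and each position passes through unchanged, which makes the gating behavior easy to sanity-check.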
Keywords
Gated self-attention, ELMo, Machine reading comprehension