Match Memory Recurrent Networks

2016 International Joint Conference on Neural Networks (IJCNN)

Abstract
Imbuing neural networks with memory and attention mechanisms allows for better generalisation from fewer data samples. By focusing only on the relevant parts of the data, encoded in an internal "memory" format, the network can infer better and more reliable patterns. Most neural attention mechanisms are based on internal network structures that impose a similarity metric (e.g., the dot product), followed by some (soft-)max operator. In this paper, we propose a novel attention method based on a function between neuron activities, which we term a "match function", augmented by a recursive softmax function. We evaluate the algorithm on the bAbI question answering dataset and show that it performs better when only one memory hop is used, both in terms of average score and in terms of the number of solved questions. Furthermore, with three memory hops, our algorithm solves 12/20 benchmark questions using 1000 training samples per task, improving on the previous state of the art of 9/20 solved questions held by end-to-end memory networks.
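The abstract contrasts the proposed method with the standard scheme of a dot-product similarity followed by a softmax over memory slots. Below is a minimal NumPy sketch of that baseline attention read, together with one possible form of the match-function variant. Since the abstract does not define the match function or the recursive softmax precisely, `match_fn`, `num_recursions`, and the multi-hop loop are illustrative assumptions, not the authors' exact method.

```python
import numpy as np

def softmax(x):
    # Numerically stable softmax over the last axis.
    e = np.exp(x - x.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

def dot_product_attention(query, memory):
    # Baseline attention described in the abstract: dot-product similarity
    # followed by a softmax, then a weighted read-out over memory slots.
    scores = memory @ query            # (num_slots,)
    weights = softmax(scores)          # attention distribution over slots
    return weights @ memory            # (dim,) read-out vector

def match_attention(query, memory, match_fn, num_recursions=2):
    # Assumed form of the match-function variant: an arbitrary function
    # between neuron activities replaces the dot product, and the softmax
    # is applied recursively ("recursive softmax" is not specified here).
    scores = np.array([match_fn(query, m) for m in memory])
    weights = scores
    for _ in range(num_recursions):
        weights = softmax(weights)
    return weights @ memory

# Example multi-hop read (assumed form, as in end-to-end memory networks):
# feed each hop's read-out back in as the next query.
rng = np.random.default_rng(0)
memory = rng.normal(size=(20, 64))     # 20 memory slots, 64-dimensional
query = rng.normal(size=64)
for _ in range(3):                     # three memory hops
    query = dot_product_attention(query, memory)
```

The multi-hop loop reflects the abstract's "three memory hops" setting; the exact way hops are chained in the paper is not given here, so the feedback-of-read-out pattern above is only one plausible choice.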
Keywords
match memory recurrent networks,neural networks,memory mechanisms,attention mechanisms,data samples,internal memory format,internal network structures,soft-max operator,neuron activities,match function,recursive softmax function,bAbI question answering dataset,end-to-end memory networks