Memory Architectures in Recurrent Neural Network Language Models

Gábor Melis
Adhiguna Kuncoro

International Conference on Learning Representations (ICLR), 2018.


Abstract:

We compare and analyze sequential, random access, and stack memory architectures for recurrent neural network language models. Our experiments on the Penn Treebank and Wikitext-2 datasets show that stack-based memory architectures consistently achieve the best performance in terms of held-out perplexity. We also propose a generalization to existing continuous stack models that allows a variable number of pop operations and further improves performance.
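
To make the continuous stack idea concrete, below is a minimal numpy sketch of the soft push/pop update in the style of Joulin & Mikolov's stack RNN, one of the continuous stack models this paper generalizes. The function name, shapes, and the hard-action demo are illustrative assumptions, not the authors' code, and the paper's variable-number-of-pops mechanism is not reproduced here.

```python
import numpy as np

def soft_stack_update(stack, actions, new_top):
    """One soft update of a continuous stack (Joulin & Mikolov 2015 style).

    stack:   (k, d) array; row 0 is the top of the stack.
    actions: (3,) probabilities for (push, pop, no-op), summing to 1.
    new_top: (d,) vector that would be pushed.
    """
    a_push, a_pop, a_noop = actions
    k, d = stack.shape
    # Push outcome: every element moves one slot deeper, new_top on top.
    pushed = np.vstack([new_top, stack[:-1]])
    # Pop outcome: every element moves one slot shallower; bottom zero-padded.
    popped = np.vstack([stack[1:], np.zeros((1, d))])
    # The new stack is a convex combination of the three outcomes,
    # so the whole update stays differentiable in the action probabilities.
    return a_push * pushed + a_pop * popped + a_noop * stack

# Demo with hard (one-hot) actions: push v1, push v2, then pop.
stack = np.zeros((4, 2))
v1, v2 = np.array([1.0, 0.0]), np.array([0.0, 1.0])
stack = soft_stack_update(stack, np.array([1.0, 0.0, 0.0]), v1)
stack = soft_stack_update(stack, np.array([1.0, 0.0, 0.0]), v2)
print(stack[0])  # top is v2
stack = soft_stack_update(stack, np.array([0.0, 1.0, 0.0]), np.zeros(2))
print(stack[0])  # top is v1 again
```

Note that each step in this formulation performs at most one (soft) pop; the generalization mentioned in the abstract relaxes exactly this restriction by allowing a variable number of pop operations per timestep.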
