Breaking the Softmax Bottleneck: A High-Rank RNN Language Model

International Conference on Learning Representations (ICLR), 2018.


Abstract:

We formulate language modeling as a matrix factorization problem, and show that the expressiveness of Softmax-based models (including the majority of neural language models) is limited by a Softmax bottleneck. Given that natural language is highly context-dependent, this further implies that in practice Softmax with distributed word embeddings does not have enough capacity to model natural language.
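
The bottleneck is a rank constraint: the log-probability matrix produced by a Softmax over dot-product logits can have rank at most the embedding dimension plus one, so it cannot match a true log-probability matrix of higher rank. Below is a minimal numpy sketch of that rank argument; the sizes N, M, d and the random matrices are illustrative assumptions, not taken from the paper.

```python
import numpy as np

# Illustrative sizes (assumptions, not from the paper): N contexts,
# M vocabulary words, embedding dimension d much smaller than M.
N, M, d = 500, 2000, 32

rng = np.random.default_rng(0)
H = rng.normal(size=(N, d)) / np.sqrt(d)  # context (hidden-state) vectors
W = rng.normal(size=(M, d))               # output word embeddings

# Softmax parameterization: row-wise Softmax over the logit matrix H @ W.T.
logits = H @ W.T
log_probs = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))

# The logits have rank at most d; the row-wise normalization term adds at
# most one more rank. So a single Softmax can only express log-probability
# matrices of rank <= d + 1, no matter how large the vocabulary is.
print(np.linalg.matrix_rank(logits))     # expected: 32  (= d)
print(np.linalg.matrix_rank(log_probs))  # expected: 33  (= d + 1)
```

The paper's proposed remedy, a Mixture of Softmaxes (MoS), combines several Softmax components with context-dependent mixture weights; the resulting log-probability matrix is no longer bounded by rank d + 1.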
