Fast and Simple Mixture of Softmaxes with BPE and Hybrid-LightRNN for Language Generation

AAAI Conference on Artificial Intelligence, 2019.


Abstract:

Mixture of Softmaxes (MoS) has been shown to be effective at addressing the expressiveness limitation of Softmax-based models. Despite this known advantage, MoS is practically limited by its large memory and computational cost, which stems from the need to compute multiple Softmaxes. In this work, we set out to unleash the power of MoS...
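For context, below is a minimal sketch of the standard MoS output layer as described in prior work, written in PyTorch; it is not this paper's optimized variant, and `hidden_dim`, `vocab_size`, and the component count `n_components` are illustrative assumptions. It makes the cost concern concrete: each of the K mixture components evaluates its own full Softmax over the vocabulary, so memory and compute grow roughly K-fold compared to a single Softmax.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class MixtureOfSoftmaxes(nn.Module):
    """Minimal sketch of a Mixture of Softmaxes (MoS) output layer."""

    def __init__(self, hidden_dim, vocab_size, n_components=4):
        super().__init__()
        self.n_components = n_components
        # Mixture weights (the "prior") and per-component latent projections.
        self.prior = nn.Linear(hidden_dim, n_components)
        self.latent = nn.Linear(hidden_dim, n_components * hidden_dim)
        # Shared decoder projecting each component's state to vocabulary logits.
        self.decoder = nn.Linear(hidden_dim, vocab_size)

    def forward(self, h):
        # h: (batch, hidden_dim) hidden states from the language model.
        pi = F.softmax(self.prior(h), dim=-1)               # (batch, K)
        hk = torch.tanh(self.latent(h))                     # (batch, K*hidden)
        hk = hk.view(-1, self.n_components, h.size(-1))     # (batch, K, hidden)
        # K separate Softmaxes over the full vocabulary -- the bottleneck.
        probs = F.softmax(self.decoder(hk), dim=-1)         # (batch, K, V)
        # Mix component distributions with the prior weights.
        return (pi.unsqueeze(-1) * probs).sum(dim=1)        # (batch, V)
```

Reducing the effective vocabulary size V, e.g. via BPE or a Hybrid-LightRNN coding as in the title, shrinks the dominant (batch, K, V) term directly.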

