From Feedforward to Recurrent LSTM Neural Networks for Language Modeling

IEEE/ACM Transactions on Audio, Speech, and Language Processing (2015)

Abstract
Language models have traditionally been estimated based on relative frequencies, using count statistics that can be extracted from huge amounts of text data. More recently, it has been found that neural networks are particularly powerful at estimating probability distributions over word sequences, giving substantial improvements over state-of-the-art count models. However, the performance of neural network language models strongly depends on their architectural structure. This paper compares count models to feedforward, recurrent, and long short-term memory (LSTM) neural network variants on two large-vocabulary speech recognition tasks. We evaluate the models in terms of perplexity and word error rate, experimentally validating the strong correlation of the two quantities, which we find to hold regardless of the underlying type of the language model. Furthermore, neural networks incur increased computational complexity compared to count models, and they model context dependences differently, often exceeding the number of words taken into account by count-based approaches. These differences require efficient search methods for neural networks, and we analyze the potential improvements that can be obtained when applying advanced algorithms to the rescoring of word lattices on large-scale setups.
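Since the abstract centers on estimating probability distributions over word sequences with an LSTM and scoring the result by perplexity, a minimal sketch may help fix the idea. This is not the authors' implementation; the PyTorch framing, layer sizes, and toy data below are assumptions for illustration only.

```python
# Minimal sketch (not the paper's code): an LSTM language model and its
# perplexity, assuming PyTorch and a toy integer-encoded word sequence.
import math
import torch
import torch.nn as nn

class LSTMLanguageModel(nn.Module):
    def __init__(self, vocab_size: int, embed_dim: int = 128, hidden_dim: int = 256):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        self.lstm = nn.LSTM(embed_dim, hidden_dim, batch_first=True)
        self.proj = nn.Linear(hidden_dim, vocab_size)

    def forward(self, tokens):
        # tokens: (batch, seq_len) integer word ids
        hidden, _ = self.lstm(self.embed(tokens))
        return self.proj(hidden)  # (batch, seq_len, vocab_size) logits

@torch.no_grad()
def perplexity(model, tokens):
    """Perplexity = exp(mean negative log-likelihood of the next word)."""
    logits = model(tokens[:, :-1])                # predict each following word
    nll = nn.functional.cross_entropy(
        logits.reshape(-1, logits.size(-1)),      # (batch*seq, vocab)
        tokens[:, 1:].reshape(-1))                # gold next words
    return math.exp(nll.item())

# Hypothetical usage on random ids, just to show the shapes involved.
model = LSTMLanguageModel(vocab_size=1000)
batch = torch.randint(0, 1000, (4, 20))           # 4 sequences of 20 words
print(perplexity(model, batch))
```

In rescoring, scores of this kind replace or interpolate with the count-model scores on word-lattice hypotheses; the paper's contribution is comparing the model families and the search algorithms at scale, not this particular network.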
Keywords
error statistics, feedforward neural networks, recurrent neural networks (RNN), long short-term memory (LSTM), language modeling, linguistics, natural language processing, speech recognition, large-vocabulary speech recognition, vocabulary, computational complexity, context dependences, count models, Kneser-Ney smoothing, perplexity, word error rate, word lattices