Classifying unprompted speech by retraining LSTM nets

ARTIFICIAL NEURAL NETWORKS: BIOLOGICAL INSPIRATIONS - ICANN 2005, PT 1, PROCEEDINGS (2005)

Cited by 14
Abstract
We apply Long Short-Term Memory (LSTM) recurrent neural networks to a large corpus of unprompted speech: the German part of the VERBMOBIL corpus. By training first on a fraction of the data, then retraining on another fraction, we both reduce training time and significantly improve recognition rates. For comparison we report recognition rates of Hidden Markov Models (HMMs) on the same corpus, and provide a promising extrapolation for HMM-LSTM hybrids.
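The two-stage scheme the abstract describes (train on one fraction of the corpus, then continue training the same weights on another fraction) can be sketched as below. This is a minimal illustration under assumptions, not the paper's actual setup: the synthetic data, feature dimension, and hyperparameters are placeholders, and PyTorch is used rather than the authors' original implementation.

```python
# Sketch of train-then-retrain for an LSTM sequence classifier.
# All data and hyperparameters are illustrative stand-ins, not the
# paper's VERBMOBIL configuration.
import torch
import torch.nn as nn

class LSTMClassifier(nn.Module):
    def __init__(self, n_features: int, n_hidden: int, n_classes: int):
        super().__init__()
        self.lstm = nn.LSTM(n_features, n_hidden, batch_first=True)
        self.out = nn.Linear(n_hidden, n_classes)

    def forward(self, x):
        # x: (batch, time, features); classify from the final hidden state
        _, (h, _) = self.lstm(x)
        return self.out(h[-1])

def train(model, x, y, epochs: int, lr: float = 1e-3):
    opt = torch.optim.Adam(model.parameters(), lr=lr)
    loss_fn = nn.CrossEntropyLoss()
    for _ in range(epochs):
        opt.zero_grad()
        loss = loss_fn(model(x), y)
        loss.backward()
        opt.step()

# Synthetic stand-ins for two disjoint fractions of framed speech features
# (e.g. MFCC-like frames -- an assumption for illustration).
n_features, n_classes = 39, 10
frac_a_x = torch.randn(256, 50, n_features)
frac_a_y = torch.randint(0, n_classes, (256,))
frac_b_x = torch.randn(256, 50, n_features)
frac_b_y = torch.randint(0, n_classes, (256,))

model = LSTMClassifier(n_features, n_hidden=100, n_classes=n_classes)
train(model, frac_a_x, frac_a_y, epochs=5)  # stage 1: initial fraction
train(model, frac_b_x, frac_b_y, epochs=5)  # stage 2: retrain on another fraction
```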
Keywords
hidden Markov models, unprompted speech, long short-term memory, LSTM nets, HMM-LSTM hybrids, VERBMOBIL corpus, recognition rates, recurrent neural networks, large corpus