A Hierarchical Language Model Based On Variable-Length Class Sequences: The MCnv Approach

IEEE TRANSACTIONS ON SPEECH AND AUDIO PROCESSING (2002)

Abstract
In this paper, we propose a new language model that represents long-term dependencies between word sequences using a multilevel hierarchy. We call this model MCnv, where n is the maximum number of words in a sequence and v is the maximum number of levels. The originality of this model, an extension of the multigram approach, is its ability to capture long-distance dependencies through dependent variable-length sequences. To discover the variable-length sequences and build the hierarchy, we use a set of 233 syntactic classes derived from eight elementary grammatical classes of French. The MCnv model learns hierarchical word patterns and uses them to rescore and filter the n-best utterance hypotheses output by our speech recognizer MAUD. The model has been trained on a corpus of 43 million words extracted from the French newspaper "Le Monde" and uses a vocabulary of 20 000 words. Tests have been conducted on 300 sentences. Compared to the class trigram and the baseline multigram approach, we report a perplexity reduction of 17% and 20%, respectively. Rescoring the original n-best hypotheses improved the word error rate by 7% and 2% relative to the class trigram and the multigrams, respectively.
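The multigram decomposition underlying MCnv segments a class string into the most probable sequence of variable-length class sequences. A minimal sketch of that step, as a Viterbi-style dynamic program over a hypothetical sequence inventory (the class labels and probabilities below are illustrative assumptions, not values from the paper):

```python
import math

# Hypothetical inventory of variable-length class sequences with
# probabilities (illustrative only; the paper uses 233 syntactic classes).
SEQ_LOGPROB = {
    ("DET", "NOUN"): math.log(0.20),
    ("DET", "ADJ", "NOUN"): math.log(0.05),
    ("VERB",): math.log(0.15),
    ("DET",): math.log(0.10),
    ("NOUN",): math.log(0.12),
    ("ADJ",): math.log(0.08),
    ("PREP",): math.log(0.09),
    ("PREP", "DET", "NOUN"): math.log(0.04),
}
MAX_LEN = 3  # plays the role of n in MCnv: longest allowed sequence


def best_segmentation(classes):
    """Dynamic program: split a class string into the most probable
    concatenation of known variable-length class sequences."""
    n = len(classes)
    # best[i] = (best log-prob of classes[:i], start index of last segment)
    best = [(-math.inf, None)] * (n + 1)
    best[0] = (0.0, None)
    for i in range(1, n + 1):
        for k in range(1, min(MAX_LEN, i) + 1):
            seq = tuple(classes[i - k:i])
            lp = SEQ_LOGPROB.get(seq)
            if lp is None:
                continue
            score = best[i - k][0] + lp
            if score > best[i][0]:
                best[i] = (score, i - k)
    # Backtrack to recover the segmentation.
    segs, i = [], n
    while i > 0:
        j = best[i][1]
        segs.append(tuple(classes[j:i]))
        i = j
    return list(reversed(segs)), best[n][0]


segs, logp = best_segmentation(
    ["DET", "ADJ", "NOUN", "VERB", "PREP", "DET", "NOUN"])
print(segs)
# → [('DET', 'ADJ', 'NOUN'), ('VERB',), ('PREP', 'DET', 'NOUN')]
```

In the full model, such segmentations are stacked into a hierarchy of up to v levels and used to rescore n-best hypothesis lists; this sketch covers only the single-level segmentation step.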
Keywords
hierarchic model, language model, multiclass, multigrams, n-class, n-grams, sequences, speech recognition