Mutual-learning Sequence-Level Knowledge Distillation for Automatic Speech Recognition

Neurocomputing (2020)

Cited by 25 | Views 29
Keywords
Automatic speech recognition (ASR), Model compression, Knowledge distillation (KD), Mutual learning, Connectionist temporal classification (CTC)