Representation Sharing And Transfer In Deep Neural Networks
Automatic Speech Recognition: A Deep Learning Approach (2015)
Abstract
We have emphasized in the previous chapters that in deep neural networks (DNNs) each hidden layer is a new representation of the raw input to the DNN, and that the representations at higher layers are more abstract than those at lower layers. In this chapter, we show that these feature representations can be shared and transferred across related tasks through techniques such as multitask learning and transfer learning. As the main example, we use multilingual and crosslingual speech recognition, built on a shared-hidden-layer DNN architecture, to demonstrate these techniques.
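The shared-hidden-layer idea can be sketched as follows: all languages pass through the same hidden layers (the shared representation), while each language keeps its own softmax output layer. This is a minimal illustrative sketch, not code from the book; the layer sizes, language names, and output dimensions are hypothetical.

```python
import numpy as np

def relu(x):
    return np.maximum(0.0, x)

def softmax(x):
    e = np.exp(x - x.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

rng = np.random.default_rng(0)
input_dim, hidden_dim = 40, 64  # hypothetical: e.g. 40-dim filterbank features

# Hidden-layer weights shared across all languages.
shared_layers = [
    rng.standard_normal((input_dim, hidden_dim)) * 0.1,
    rng.standard_normal((hidden_dim, hidden_dim)) * 0.1,
]

# One language-specific softmax layer per language (output sizes hypothetical).
output_heads = {
    "en": rng.standard_normal((hidden_dim, 100)) * 0.1,
    "zh": rng.standard_normal((hidden_dim, 120)) * 0.1,
}

def forward(x, lang):
    """Compute output posteriors for a batch x under language `lang`."""
    h = x
    for w in shared_layers:        # shared representation layers
        h = relu(h @ w)
    return softmax(h @ output_heads[lang])  # language-specific classifier

# The same input frames can be scored under either language's head.
x = rng.standard_normal((2, input_dim))
p_en = forward(x, "en")   # shape (2, 100)
p_zh = forward(x, "zh")   # shape (2, 120)
```

In training, batches from every language update the shared layers while each batch updates only its own language's output layer; transferring to a new language then amounts to attaching (and training) a fresh output head on top of the already-learned shared layers.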