Learning To Learn And Compositionality With Deep Recurrent Neural Networks

KDD (2016)

Cited by 2 | Viewed 24

Abstract
Deep neural network representations play an important role in computer vision, speech, computational linguistics, robotics, reinforcement learning and many other data-rich domains. In this talk I will show that learning-to-learn and compositionality are key ingredients for dealing with knowledge transfer so as to solve a wide range of tasks, for dealing with small-data regimes, and for continual learning. I will demonstrate this with several examples from my research team: learning to learn by gradient descent [1], neural programmers and interpreters [3], and learning communication [2].
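The "learning to learn by gradient descent" idea [1] replaces a hand-designed optimizer with a learned update rule, tuned by an outer loop on the inner task's final loss. The paper learns an LSTM optimizer; as a minimal sketch under much simpler assumptions, the learned rule below is just a scalar step size, and the outer loop uses a finite-difference meta-gradient (all names and the toy quadratic task are illustrative, not from the paper):

```python
# Minimal sketch of learning-to-learn: an outer loop tunes the
# parameters of an update rule that an inner loop uses to optimize
# a task. Here the "learned optimizer" is a single scalar step size
# (the paper in [1] uses an LSTM instead).

def inner_loss(theta):
    return (theta - 3.0) ** 2       # toy inner task: a quadratic

def inner_grad(theta):
    return 2.0 * (theta - 3.0)

def run_inner(step_size, steps=20, theta0=0.0):
    """Optimize the inner task with the given update rule; return final loss."""
    theta = theta0
    for _ in range(steps):
        theta = theta - step_size * inner_grad(theta)
    return inner_loss(theta)

# Outer (meta) loop: finite-difference gradient of the final inner
# loss with respect to the update rule's parameter.
step_size, meta_lr, eps = 0.01, 1e-3, 1e-4
for _ in range(100):
    meta_g = (run_inner(step_size + eps) - run_inner(step_size - eps)) / (2 * eps)
    step_size -= meta_lr * meta_g

# The meta-trained step size should beat the initial hand-picked one.
print(run_inner(step_size), run_inner(0.01))
```

The same pattern scales up by making the update rule a recurrent network that consumes gradients and emits parameter updates, and by replacing the finite-difference outer loop with backpropagation through the unrolled inner optimization.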
Keywords
Deep Learning,Recurrent Neural Networks,Compositionality,Transfer Learning,Learning to Learn,Abstraction,Algorithm Induction