Compositional Continual Language Learning

International Conference on Learning Representations

Abstract
Motivated by humans' ability to continually learn and gain knowledge over time, several research efforts have been pushing the limits of machines to learn constantly while alleviating catastrophic forgetting: a significant drop in a skill the machine acquired or exercised far earlier in time. Most existing methods have focused on label prediction tasks to study continual learning. Humans, however, naturally interact with and learn from natural language statements and instructions, which is far less studied from a continual learning angle. One of the key skills that enables humans to learn language efficiently is the ability to produce novel compositions. To learn and complete new tasks, robots need to continually learn novel objects and concepts in linguistic form, which requires compositionality for efficient learning. Inspired by this, we propose in this paper a method for compositional continual learning of sequence-to-sequence models. Experimental results show that the proposed method significantly improves over state-of-the-art methods: it enables knowledge transfer and prevents catastrophic forgetting, maintaining more than 85% accuracy through 100 stages, compared with less than 50% accuracy for baselines. It also shows significant improvement on a machine translation task. This is the first work to combine continual learning and compositionality for natural language instruction learning, and we hope it will make robots more helpful in various tasks.