Bypassing Gradients Re-Projection with Episodic Memories in Online Continual Learning

arXiv (2020)

Abstract
Episodic memories are an efficient means of preventing catastrophic forgetting in continual learning. Several recent gradient-based approaches make more efficient use of compact episodic memories by constraining the gradients computed on new samples with gradients computed on memorized samples. In this paper, instead of directly re-projecting the gradients, we propose to decrease the diversity of gradients through an auxiliary optimization objective that we call the Discriminative Representation Loss. Our method shows promising performance at relatively low computational cost on several benchmarks.
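To make the contrast concrete, the sketch below sets a gradient re-projection rule (the published A-GEM update, one of the gradient-constraint methods the abstract alludes to) next to a hypothetical auxiliary representation loss. The `discriminative_representation_loss` function is purely an illustrative stand-in: the abstract does not specify the actual form of DRL, so the code assumes a generic penalty on representation similarity across classes.

```python
# Sketch contrasting the two strategies described in the abstract.
# agem_project follows the published A-GEM rule; the auxiliary loss
# below is a HYPOTHETICAL stand-in, not the paper's actual DRL.
import torch
import torch.nn.functional as F


def agem_project(grad: torch.Tensor, mem_grad: torch.Tensor) -> torch.Tensor:
    """A-GEM-style re-projection on flattened 1-D gradient vectors:
    if the new-sample gradient conflicts with the memory gradient
    (negative dot product), project it onto the half-space where the
    memory loss does not increase."""
    dot = torch.dot(grad, mem_grad)
    if dot < 0:
        grad = grad - (dot / torch.dot(mem_grad, mem_grad)) * mem_grad
    return grad


def discriminative_representation_loss(feats: torch.Tensor,
                                       labels: torch.Tensor) -> torch.Tensor:
    """Hypothetical auxiliary objective: penalize cosine similarity
    between representations of samples from different classes, which
    (per the abstract's reasoning) reduces gradient diversity without
    any explicit per-step gradient projection."""
    feats = F.normalize(feats, dim=1)
    sim = feats @ feats.t()                         # pairwise cosine similarity
    diff_class = labels.unsqueeze(0) != labels.unsqueeze(1)
    if diff_class.any():
        return sim[diff_class].mean()
    return sim.sum() * 0.0                          # keep graph if batch is one class
```

In this sketch the auxiliary term would simply be added to the task loss (e.g. `loss = task_loss + lam * drl`), so each step costs one extra similarity computation rather than a gradient projection or quadratic program; this is consistent with, though not confirmed by, the abstract's claim of relatively low computational cost.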
Keywords
online continual learning, loss, representation, semi-discriminative