Generative Feature Replay For Class-Incremental Learning

CVPR Workshops (2020)

Abstract
Humans are capable of learning new tasks without forgetting previous ones, while neural networks fail due to catastrophic forgetting between new and previously learned tasks. We consider a class-incremental setting, which means that the task-ID is unknown at inference time. The imbalance between old and new classes typically results in a bias of the network towards the newest ones. This imbalance problem can either be addressed by storing exemplars from previous tasks, or by using image replay methods. However, the latter can only be applied to toy datasets, since image generation for complex datasets is a hard problem. We propose a solution to the imbalance problem based on generative feature replay which does not require any exemplars. To do this, we split the network into two parts: a feature extractor and a classifier. To prevent forgetting, we combine generative feature replay in the classifier with feature distillation in the feature extractor. Through feature generation, our method reduces the complexity of generative replay and prevents the imbalance problem. Our approach is computationally efficient and scalable to large datasets. Experiments confirm that our approach achieves state-of-the-art results on CIFAR-100 and ImageNet, while requiring only a fraction of the storage needed for exemplar-based continual learning. Code is available at this https URL.
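The abstract outlines three training signals: a classification loss on the current task's features, feature distillation to keep the feature extractor stable, and generative feature replay of old-class features to keep the classifier balanced. Below is a minimal PyTorch sketch of how these losses could fit together; the layer sizes, the MLP feature extractor, the conditional generator, and the L2 distillation loss are all illustrative assumptions rather than the authors' implementation (see their linked code for the actual method).

import copy
import torch
import torch.nn as nn
import torch.nn.functional as F

FEAT_DIM, NUM_CLASSES, LATENT_DIM = 512, 100, 128  # assumed sizes

class FeatureGenerator(nn.Module):
    """Class-conditional feature generator: an illustrative stand-in for
    the generator the paper trains on old-task features."""
    def __init__(self):
        super().__init__()
        self.embed = nn.Embedding(NUM_CLASSES, LATENT_DIM)
        self.net = nn.Sequential(
            nn.Linear(2 * LATENT_DIM, 256), nn.ReLU(),
            nn.Linear(256, FEAT_DIM), nn.ReLU(),
        )

    def forward(self, z, y):
        return self.net(torch.cat([z, self.embed(y)], dim=1))

# Toy feature extractor / classifier split; a real setup would use a ConvNet.
feature_extractor = nn.Sequential(
    nn.Flatten(), nn.Linear(3 * 32 * 32, FEAT_DIM), nn.ReLU())
classifier = nn.Linear(FEAT_DIM, NUM_CLASSES)

# Frozen snapshots taken at the end of the previous task.
old_extractor = copy.deepcopy(feature_extractor).eval()
old_generator = FeatureGenerator().eval()  # assumed trained on old features
for p in list(old_extractor.parameters()) + list(old_generator.parameters()):
    p.requires_grad_(False)

def train_step(x, y, old_class_ids):
    """One incremental-learning step on a batch (x, y) of new-task data."""
    feats = feature_extractor(x)

    # (1) Classification loss on real features of the new classes.
    loss_cls = F.cross_entropy(classifier(feats), y)

    # (2) Feature distillation: keep the extractor close to its old version.
    loss_distill = F.mse_loss(feats, old_extractor(x))

    # (3) Generative feature replay: sample fake features for old classes so
    # the classifier stays balanced without storing any exemplars.
    y_old = old_class_ids[torch.randint(len(old_class_ids), (x.size(0),))]
    z = torch.randn(x.size(0), LATENT_DIM)
    loss_replay = F.cross_entropy(classifier(old_generator(z, y_old)), y_old)

    return loss_cls + loss_distill + loss_replay

# Toy usage on random CIFAR-sized inputs.
x = torch.randn(8, 3, 32, 32)
y = torch.randint(50, 60, (8,))            # labels from the current task
loss = train_step(x, y, torch.arange(50))  # classes 0..49 are old
loss.backward()

Note that only features are replayed, never images, which is why the method scales to datasets like ImageNet where image generation remains difficult.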
Keywords
ImageNet, CIFAR-100, inference time, exemplar-based continual learning, generative replay, feature generation, feature distillation, feature extractor, generative feature replay, image generation, image replay methods, imbalance problem, new classes, old classes, task-ID, class-incremental setting, previously-learned tasks, catastrophic forgetting, neural networks, class-incremental learning