Learning a dual-branch classifier for class incremental learning

Applied Intelligence (2022)

Abstract
Catastrophic forgetting is a non-trivial challenge in class incremental learning, caused by learning new knowledge and by data imbalance between old and new classes. To alleviate this challenge, we propose a class incremental learning method with a dual-branch classifier. First, inspired by ensemble learning, the proposed method constructs a dual network consisting of two complementary branches to alleviate the impact of data imbalance. Second, an activation transfer loss is employed to reduce catastrophic forgetting at the level of feature representation, preserving the feature separability of old classes. Third, we use the nearest class mean classifier, which is naturally suited to incremental classification. Moreover, we formulate an end-to-end training algorithm for the feature extractor and classifier to improve how well the two modules match. Extensive evaluation results show that the proposed method achieves strong incremental recognition ability with less training time. The ablation study further shows the importance and necessity of the dual-branch structure, end-to-end training, and the activation transfer loss.
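The nearest class mean (NCM) classifier mentioned in the abstract is a standard technique: each class is represented by the mean of its feature embeddings, and a query is assigned to the class with the closest mean. The sketch below is not the paper's implementation; it only illustrates the general idea on pre-computed feature vectors, and all names (compute_class_means, ncm_predict, the toy data) are illustrative assumptions.

```python
# Minimal sketch of a nearest-class-mean classifier over fixed features.
# The paper applies NCM on top of its learned dual-branch features; here the
# feature extractor is abstracted away as pre-computed embeddings.
import numpy as np

def compute_class_means(features, labels):
    """Average the feature vectors of each class to obtain its prototype."""
    return {c: features[labels == c].mean(axis=0) for c in np.unique(labels)}

def ncm_predict(queries, class_means):
    """Assign each query feature to the class with the nearest mean."""
    classes = sorted(class_means)
    means = np.stack([class_means[c] for c in classes])                    # (C, D)
    dists = np.linalg.norm(queries[:, None, :] - means[None], axis=-1)     # (N, C)
    return np.array(classes)[dists.argmin(axis=1)]

# Toy usage: 2-D embeddings for two old classes (0, 1) and one new class (2).
feats = np.array([[0.0, 0.1], [0.1, 0.0], [1.0, 1.1], [1.1, 1.0], [2.0, 2.1]])
labs = np.array([0, 0, 1, 1, 2])
means = compute_class_means(feats, labs)
print(ncm_predict(np.array([[0.05, 0.05], [1.9, 2.0]]), means))  # -> [0 2]
```

Because NCM only requires per-class feature means, adding a new class amounts to computing one more prototype rather than retraining a parametric output layer, which is the "natural advantage" for incremental classification the abstract alludes to.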
Keywords
Deep learning,Incremental learning,Ensemble learning,Distillation loss,Image classification