Incremental Learning Based on Dual-Branch Network

Pattern Recognition and Computer Vision, PRCV 2023, Part III (2024)

Abstract
Incremental learning aims to overcome catastrophic forgetting. When a model learns multiple tasks sequentially, the imbalance between the numbers of new and old classes means that the knowledge of old classes stored in the model is overwritten by the large number of new classes. Existing single-backbone models struggle to avoid catastrophic forgetting. In this paper, we propose using a dual-branch network model to learn new tasks and thereby alleviate catastrophic forgetting. Unlike previous dual-branch models that learn tasks in parallel, we propose using the dual-branch network to learn tasks serially: the model creates a new backbone for learning the remaining tasks and freezes the previous backbone. In this way, the model reduces damage to the previous backbone's parameters, which encode the old tasks. The model uses knowledge distillation to preserve the information of old tasks while learning new tasks, and we also analyze different distillation methods for the dual-branch network model. This paper mainly focuses on the more challenging class-incremental learning scenario. We use the common incremental learning setting on the ImageNet-100 dataset. The experimental results show that accuracy can be improved by using the dual-branch network.
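The abstract describes the approach only at a high level. The following is a minimal, hypothetical PyTorch sketch of the core idea: a frozen old backbone plus a trainable new backbone learned serially, combined with a logit-distillation term against the previous model. All names (DualBranchIncrementalModel, distillation_loss, train_step), the feature-concatenation classifier, and the loss weighting lambda_kd are illustrative assumptions, not the authors' implementation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class DualBranchIncrementalModel(nn.Module):
    """Serial dual-branch model: the backbone trained on old tasks is frozen,
    and a new trainable backbone is added for the remaining tasks."""

    def __init__(self, old_backbone: nn.Module, new_backbone: nn.Module,
                 feat_dim: int, num_classes: int):
        super().__init__()
        self.old_backbone = old_backbone
        for p in self.old_backbone.parameters():
            p.requires_grad = False          # freeze to protect old-task parameters
        self.new_backbone = new_backbone     # trainable branch for the new tasks
        self.classifier = nn.Linear(2 * feat_dim, num_classes)

    def forward(self, x):
        with torch.no_grad():
            f_old = self.old_backbone(x)     # features from the frozen branch
        f_new = self.new_backbone(x)         # features from the trainable branch
        return self.classifier(torch.cat([f_old, f_new], dim=1))


def distillation_loss(student_logits, teacher_logits, num_old_classes, T=2.0):
    """Logit distillation restricted to the old classes (one common choice)."""
    s = F.log_softmax(student_logits[:, :num_old_classes] / T, dim=1)
    t = F.softmax(teacher_logits[:, :num_old_classes] / T, dim=1)
    return F.kl_div(s, t, reduction="batchmean") * (T * T)


def train_step(model, old_model, x, y, num_old_classes, lambda_kd=1.0):
    """One training step: cross-entropy on the new data plus distillation
    against a frozen copy of the previous model (weighting is an assumption)."""
    logits = model(x)
    with torch.no_grad():
        old_logits = old_model(x)
    return F.cross_entropy(logits, y) + lambda_kd * distillation_loss(
        logits, old_logits, num_old_classes)
```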
Keywords
incremental learning, knowledge distillation, catastrophic forgetting