Feature Estimations Based Correlation Distillation for Incremental Image Retrieval

IEEE TRANSACTIONS ON MULTIMEDIA (2022)

Cited by 18 | Views 100
Abstract
Deep learning for fine-grained image retrieval in an incremental context is less investigated. In this paper, we explore this task to give the model a continuous retrieval ability: the model should perform well on newly arriving data while reducing forgetting of the knowledge learned on preceding old tasks. For this purpose, we distill knowledge of the semantic correlations among the representations extracted from the new data only, so as to regularize the parameter updates within a teacher-student framework. In particular, when multiple tasks are learned sequentially, aside from the correlations distilled from the penultimate model, we estimate the representations of all prior models, and hence their semantic correlations, from the representations extracted from the new data. The estimated correlations serve as an additional regularizer that further prevents catastrophic forgetting across all previous tasks, without the need to save the stream of models trained on those tasks. Extensive experiments demonstrate that the proposed method performs favorably in retaining performance on already-trained old tasks while achieving good accuracy on the current task, whether new data are added at once or sequentially.
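The core idea of correlation distillation can be sketched as follows. This is an illustrative reading of the abstract, not the authors' released code: the function names are hypothetical, and cosine similarity and a mean-squared penalty are assumed as one common instantiation of "semantic correlations" and the distillation loss.

```python
import numpy as np

def correlation_matrix(feats):
    """Pairwise cosine-similarity matrix over a batch of feature vectors.

    feats: array of shape (batch, dim).
    """
    norms = np.linalg.norm(feats, axis=1, keepdims=True)
    normed = feats / np.clip(norms, 1e-12, None)  # guard against zero vectors
    return normed @ normed.T

def correlation_distillation_loss(student_feats, teacher_feats):
    """Penalize divergence between student and teacher correlation structure.

    Both feature batches are extracted from the *new* data only; the teacher
    is the frozen model from the previous task.
    """
    c_student = correlation_matrix(student_feats)
    c_teacher = correlation_matrix(teacher_feats)
    return float(np.mean((c_student - c_teacher) ** 2))
```

In training, this loss would be added to the retrieval loss on the new task, so that the student's representation geometry on new images stays close to the teacher's while the parameters adapt.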
Keywords
Task analysis, Correlation, Data models, Modeling, Training, Context modeling, Image retrieval, Incremental learning, fine-grained image retrieval, correlations distillation, feature estimation