A Few Guidelines for Incremental Few-Shot Segmentation

arXiv (2021)

Abstract
Reducing the amount of supervision required by neural networks is especially important for semantic segmentation, where collecting dense pixel-level annotations is particularly expensive. In this paper, we address this problem from a new perspective: Incremental Few-Shot Segmentation. In particular, given a pretrained segmentation model and a few images containing novel classes, our goal is to learn to segment the novel classes while retaining the ability to segment previously seen ones. In this context, we find, contrary to common belief, that fine-tuning the whole architecture with these few images is not only meaningful but also very effective. We show that the main problems of end-to-end training in this scenario are (i) the drift of the batch-normalization statistics toward the novel classes, which we fix with batch renormalization, and (ii) the forgetting of old classes, which we fix with regularization strategies. We summarize our findings in five guidelines that together consistently lead to state-of-the-art results on the COCO and Pascal-VOC 2012 datasets, with different numbers of images per class and even with multiple learning episodes.
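The abstract attributes the failure of naive fine-tuning partly to batch-normalization statistics drifting toward the few novel-class images, and names batch renormalization as the fix. A minimal NumPy sketch of the batch-renormalization forward pass (Ioffe, 2017) is given below; the function name, shapes, and clipping thresholds `r_max`/`d_max` are illustrative assumptions, not the paper's implementation:

```python
import numpy as np

def batch_renorm(x, running_mean, running_std, r_max=3.0, d_max=5.0, eps=1e-5):
    """Batch renormalization forward pass (sketch).

    Normalizes with the current batch statistics, then applies the
    clipped correction factors r and d that pull the result toward the
    running (population) statistics, so a handful of novel-class images
    cannot drag the normalization away from old classes.
    """
    mu_b = x.mean(axis=0)
    sigma_b = x.std(axis=0) + eps
    # r and d are clipped and, during training, treated as constants
    # (no gradient flows through them).
    r = np.clip(sigma_b / running_std, 1.0 / r_max, r_max)
    d = np.clip((mu_b - running_mean) / running_std, -d_max, d_max)
    return (x - mu_b) / sigma_b * r + d

# Example: a small "novel class" batch whose statistics differ from the
# running ones accumulated on the old classes.
rng = np.random.default_rng(0)
x = rng.normal(loc=2.0, scale=1.5, size=(64, 8))
y = batch_renorm(x, running_mean=np.zeros(8), running_std=np.ones(8))
```

When the running statistics match the batch statistics, r is 1 and d is 0, and the layer reduces to standard batch normalization; the clipping only matters when the few-shot batch drifts away from the population.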
Keywords
segmentation, few guidelines, few-shot