Few-Shot Class Incremental Learning with Attention-Aware Self-Adaptive Prompt
arXiv (2024)
Abstract
Few-Shot Class-Incremental Learning (FSCIL) models aim to incrementally learn
new classes with scarce samples while preserving knowledge of old ones.
Existing FSCIL methods usually fine-tune the entire backbone, leading to
overfitting and hindering the potential to learn new classes. On the other
hand, recent prompt-based CIL approaches alleviate forgetting by training
prompts with sufficient data in each task. In this work, we propose a novel
framework named Attention-aware Self-adaptive Prompt (ASP). ASP encourages
task-invariant prompts to capture shared knowledge by reducing specific
information from the attention aspect. Additionally, self-adaptive
task-specific prompts in ASP provide specific information and transfer
knowledge from old classes to new classes with an Information Bottleneck
learning objective. In summary, ASP prevents overfitting on the base task and does
not require enormous data in few-shot incremental tasks. Extensive experiments
on three benchmark datasets validate that ASP consistently outperforms
state-of-the-art FSCIL and prompt-based CIL methods in terms of both learning
new classes and mitigating forgetting.
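
To make the prompt-based setup concrete, below is a minimal sketch (not the authors' code) of how learnable task-invariant and task-specific prompts can be prepended to the token sequence of a frozen Transformer backbone, with only the prompts and the classification head receiving gradients. The class name `PromptedViT`, the prompt lengths, and the use of a single encoder layer as a stand-in for a pre-trained ViT are illustrative assumptions; ASP's attention-aware prompt generation and Information Bottleneck objective are not reproduced here.

```python
# Illustrative sketch only: prompt tuning on a frozen encoder, in the spirit of
# prompt-based CIL. Not the ASP implementation.
import torch
import torch.nn as nn

class PromptedViT(nn.Module):
    def __init__(self, embed_dim=768, n_invariant=10, n_specific=10, n_classes=100):
        super().__init__()
        # A pre-trained ViT encoder would normally go here; a single frozen
        # encoder layer stands in to keep the sketch self-contained.
        self.encoder = nn.TransformerEncoderLayer(
            d_model=embed_dim, nhead=12, batch_first=True
        )
        for p in self.encoder.parameters():
            p.requires_grad = False
        # Task-invariant prompts: shared across tasks to capture common knowledge.
        self.invariant_prompts = nn.Parameter(torch.randn(n_invariant, embed_dim) * 0.02)
        # Task-specific prompts: adapted per incremental task with few samples.
        self.specific_prompts = nn.Parameter(torch.randn(n_specific, embed_dim) * 0.02)
        self.head = nn.Linear(embed_dim, n_classes)

    def forward(self, patch_tokens):
        # patch_tokens: (batch, n_patches, embed_dim) from a frozen patch embedding.
        b = patch_tokens.size(0)
        prompts = torch.cat([self.invariant_prompts, self.specific_prompts], dim=0)
        prompts = prompts.unsqueeze(0).expand(b, -1, -1)
        x = torch.cat([prompts, patch_tokens], dim=1)  # prepend prompts to the sequence
        x = self.encoder(x)
        # Pool over the prompt positions to form the classification feature.
        feat = x[:, : prompts.size(1)].mean(dim=1)
        return self.head(feat)

# Usage: only the prompts and the head are trainable; the backbone stays frozen.
model = PromptedViT()
tokens = torch.randn(4, 196, 768)
logits = model(tokens)
```

The design choice this illustrates is the one the abstract contrasts with full fine-tuning: because the backbone is frozen and only a small number of prompt parameters are updated, the model is far less prone to overfitting when each incremental task provides only a few samples.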