Bayesian Sharpness-Aware Prompt Tuning for Cross-Domain Few-shot Learning.

Shuo Fan, Liansheng Zhuang, Li Aodi


Few-shot learning aims to learn a classifier that recognizes novel classes from only a few labeled images per class. Fine-tuning is a promising tool for the few-shot learning problem: a large-scale model is pre-trained on source domains and then adapted to target domains. However, existing methods generalize poorly under the domain-shift problem in the cross-domain scenario. Inspired by recent advances in domain generalization and prompt-based tuning methods, this paper proposes Bayesian Sharpness-Aware Prompt Tuning (BSAPT) for the cross-domain few-shot learning task. Instead of learning deterministic prompts like existing methods, BSAPT learns a weight distribution over prompts to model the uncertainty caused by limited training data and to resist overfitting. To improve generalization, BSAPT seeks prompts that lie in neighborhoods with uniformly low loss by simultaneously minimizing the training loss value and the loss sharpness. Benefiting from deterministic pre-training and Bayesian inference, BSAPT generalizes better and overfits less than existing fine-tuning methods. Extensive experiments on public datasets show that BSAPT outperforms existing methods and achieves new state-of-the-art performance on the cross-domain few-shot learning task.
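The two mechanisms described above can be sketched on a toy problem. This is a minimal illustrative sketch, not the paper's implementation: the prompt is a 2-dimensional vector, the training loss is a hand-picked quadratic surrogate, and all names and hyperparameters (`mu`, `log_sigma`, `rho`, `lr`) are assumptions for illustration. The prompt distribution is a diagonal Gaussian sampled via the reparameterization trick, and each update is a SAM-style two-step: first ascend to the worst-case neighbor within radius `rho`, then descend using the gradient taken there.

```python
import numpy as np

rng = np.random.default_rng(0)

def loss(prompt):
    # Toy quadratic surrogate for the few-shot training loss;
    # its minimum sits at prompt = (1, 1).
    return float(np.sum((prompt - 1.0) ** 2))

def grad(prompt):
    # Analytic gradient of the toy loss.
    return 2.0 * (prompt - 1.0)

def sample_prompt(mu, log_sigma, noise):
    # Reparameterization trick: prompt = mu + sigma * noise, so the
    # loss gradient flows back to the distribution parameters.
    return mu + np.exp(log_sigma) * noise

def bsapt_style_step(mu, log_sigma, rho=0.05, lr=0.1):
    """One sketch update: sample a prompt from the learned Gaussian,
    perturb the distribution parameters toward the worst case within
    radius rho (SAM ascent), then descend with the gradient taken at
    the perturbed point."""
    noise = rng.standard_normal(mu.shape)
    p = sample_prompt(mu, log_sigma, noise)
    g_p = grad(p)
    # Chain rule through the reparameterization.
    g_mu = g_p
    g_ls = g_p * noise * np.exp(log_sigma)
    # SAM ascent: worst-case perturbation of (mu, log_sigma).
    norm = np.sqrt(np.sum(g_mu ** 2) + np.sum(g_ls ** 2)) + 1e-12
    mu_adv = mu + rho * g_mu / norm
    ls_adv = log_sigma + rho * g_ls / norm
    # Descent using gradients evaluated at the perturbed parameters.
    g_adv = grad(sample_prompt(mu_adv, ls_adv, noise))
    mu = mu - lr * g_adv
    log_sigma = log_sigma - lr * g_adv * noise * np.exp(ls_adv)
    return mu, log_sigma

mu = np.array([3.0, -2.0])                 # initial prompt mean
log_sigma = np.full(2, np.log(0.1))        # initial prompt std = 0.1
for _ in range(200):
    mu, log_sigma = bsapt_style_step(mu, log_sigma)
print(loss(mu) < 0.5)  # mean prompt has moved near the minimum
```

In the real method the quadratic surrogate would be replaced by the classification loss of the frozen pre-trained backbone with the sampled prompt prepended to its input; the sketch only shows how the Bayesian sampling and the sharpness-aware two-step update compose.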
few-shot learning, visual prompt tuning, Bayesian inference, model generalization