Enhanced Transfer Learning with Efficient Modeling and Adaptive Fusion of Knowledge Via Prompt Tuning

Minghui Xu, Zishan Guo, Yulong Zeng, Deyi Xiong

ICASSP 2024 - 2024 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP), 2024

Abstract
This work presents a novel and parameter-efficient transfer learning framework. The framework consists of two phases: knowledge modeling based on prompt decomposition and knowledge transfer based on attention. Specifically, during the first phase, we decompose the prompt into parameter spaces of different ranks and leverage their characteristics to precisely model both general knowledge and task-specific knowledge separately. During the second phase, we train an attention module to adaptively integrate task-specific knowledge and generate an instance-wise prompt, which is then further fine-tuned. Through these two stages, our approach can accurately and efficiently transfer task knowledge of different granularities and types based on the input sample during inference. Extensive experiments demonstrate that our approach outperforms the multi-task learning baselines and SOTA parameter-efficient transfer learning methods. Furthermore, despite using only 0.13% of the parameters compared to full-parameter fine-tuning, it achieves an absolute improvement ranging from 0.3 to 3.9 across different benchmarks.
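The abstract outlines two components: prompts decomposed into low-rank parameter spaces of different ranks (to separate general from task-specific knowledge), and an attention module that adaptively fuses task-specific knowledge into an instance-wise prompt. The PyTorch sketch below illustrates one possible reading of these two ideas only; the class names, ranks, pooling choice, and attention scoring are illustrative assumptions and not the authors' implementation.

# A minimal sketch (not the authors' code) of the two ideas in the abstract:
# (1) prompts parameterized as low-rank factors of different ranks, and
# (2) an attention module that fuses task-specific prompts per input instance.
# All names, ranks, and dimensions below are illustrative assumptions.
import torch
import torch.nn as nn


class LowRankPrompt(nn.Module):
    """Prompt of shape (prompt_len, d_model) parameterized as a rank-r product."""

    def __init__(self, prompt_len: int, d_model: int, rank: int):
        super().__init__()
        self.a = nn.Parameter(torch.randn(prompt_len, rank) * 0.02)
        self.b = nn.Parameter(torch.randn(rank, d_model) * 0.02)

    def forward(self) -> torch.Tensor:
        return self.a @ self.b  # (prompt_len, d_model)


class AdaptiveFusionPrompt(nn.Module):
    """Instance-wise prompt: a shared general prompt plus an attention-weighted
    mixture of task-specific prompts, conditioned on a pooled input embedding."""

    def __init__(self, prompt_len: int, d_model: int, num_tasks: int,
                 general_rank: int = 64, task_rank: int = 8):
        super().__init__()
        # Assumed split: a higher rank for broadly shared (general) knowledge,
        # a lower rank for each task-specific prompt.
        self.general = LowRankPrompt(prompt_len, d_model, general_rank)
        self.task_prompts = nn.ModuleList(
            [LowRankPrompt(prompt_len, d_model, task_rank) for _ in range(num_tasks)]
        )
        self.query = nn.Linear(d_model, d_model)  # maps the input representation to a query

    def forward(self, pooled_input: torch.Tensor) -> torch.Tensor:
        # pooled_input: (batch, d_model), e.g. mean-pooled frozen-LM embeddings.
        task_mats = torch.stack([p() for p in self.task_prompts])   # (T, L, d)
        keys = task_mats.mean(dim=1)                                # (T, d)
        q = self.query(pooled_input)                                # (B, d)
        attn = torch.softmax(q @ keys.t() / keys.size(-1) ** 0.5, dim=-1)  # (B, T)
        fused = torch.einsum("bt,tld->bld", attn, task_mats)        # (B, L, d)
        return self.general().unsqueeze(0) + fused                  # instance-wise prompt


# Usage: prepend the returned prompt to the frozen model's input embeddings.
prompt_gen = AdaptiveFusionPrompt(prompt_len=20, d_model=768, num_tasks=6)
instance_prompt = prompt_gen(torch.randn(4, 768))  # (4, 20, 768)

Giving the shared prompt a higher rank and each task-specific prompt a lower rank mirrors the abstract's separation of general and task-specific knowledge while keeping the trainable parameter count small.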
Keywords
Parameter-efficient fine-tuning, Prompt tuning, Natural language processing, Transfer learning