MoPE: Mixture of Prefix Experts for Zero-Shot Dialogue State Tracking
arXiv (2024)
Abstract
Zero-shot dialogue state tracking (DST) transfers knowledge to unseen
domains, reducing the cost of annotating new datasets. Previous zero-shot DST
models mainly suffer from domain transferring and partial prediction problems.
To address these challenges, we propose Mixture of Prefix Experts (MoPE) to
establish connections between similar slots in different domains, which
strengthens the model's transfer performance in unseen domains. Empirical results
demonstrate that MoPE-DST achieves a joint goal accuracy of 57.13% on
MultiWOZ2.1 and 55.40% on SGD.
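
The abstract only sketches the mechanism, so below is a minimal, hypothetical illustration of the "mixture of prefix experts" idea: slot descriptions are embedded and clustered so that similar slots across domains share one learnable prefix expert, and a slot from an unseen domain is routed to its nearest cluster. The encoder stub, cluster count, prefix shape, and all names here are assumptions for illustration, not the paper's implementation.

```python
# Hypothetical sketch of prefix-expert routing (not the paper's code).
import numpy as np
import torch
from sklearn.cluster import KMeans

def embed(texts):
    """Stub encoder; a real system would use a sentence encoder."""
    rng = np.random.default_rng(0)
    return rng.normal(size=(len(texts), 32)).astype(np.float32)

# Slot descriptions from several training domains; similar slots
# (e.g., "area") should land in the same cluster.
slot_descriptions = [
    "restaurant-area: area of the restaurant",
    "hotel-area: area of the hotel",
    "train-departure: departure station of the train",
]

n_experts = 2  # assumed number of prefix experts
kmeans = KMeans(n_clusters=n_experts, n_init=10, random_state=0)
kmeans.fit(embed(slot_descriptions))

# One learnable prefix per expert, shape (prefix_len, hidden_dim);
# these would be trained while the language model stays frozen.
prefix_experts = torch.nn.ParameterList(
    [torch.nn.Parameter(torch.randn(8, 64)) for _ in range(n_experts)]
)

def route(description):
    """Assign a slot (possibly from an unseen domain) to the expert
    whose cluster centroid is nearest to its embedding."""
    expert_id = int(kmeans.predict(embed([description]))[0])
    return prefix_experts[expert_id]

# An unseen-domain slot reuses the prefix of its most similar cluster.
prefix = route("attraction-area: area of the attraction")
print(prefix.shape)  # torch.Size([8, 64]); prepended to the frozen LM
```

In this reading, the per-cluster prefixes are what connect similar slots across domains: an unseen slot never needs its own trained parameters, only a nearest-cluster lookup.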