MCML: A Novel Memory-based Contrastive Meta-Learning Method for Few Shot Slot Tagging

Proceedings of the 13th International Joint Conference on Natural Language Processing and the 3rd Conference of the Asia-Pacific Chapter of the Association for Computational Linguistics (Volume 1: Long Papers) (2021)

Abstract
Meta-learning is widely used for few-shot slot tagging. However, the performance of existing methods is seriously affected by catastrophic forgetting. This phenomenon is common in deep learning because the training and testing modules fail to take historical information into account, i.e., previously trained episodes in metric-based meta-learning. To overcome this predicament, we propose the Memory-based Contrastive Meta-learning (MCML) method. Specifically, we propose a learn-from-memory mechanism that uses an explicit memory to keep track of the label representations of previously trained episodes, together with a contrastive learning method that compares the label embeddings in the current few-shot episode with the historic ones stored in the memory, and an adaptation-from-memory mechanism that determines the output label based on the contrast between the label embeddings of the test episode and the label clusters in the memory. Experimental results show that MCML is scalable and outperforms metric-based meta-learning and optimization-based meta-learning in the 1-shot, 5-shot, 10-shot, and 20-shot scenarios of the SNIPS dataset.
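The adaptation-from-memory idea described above can be illustrated with a minimal sketch: a memory bank keyed by slot label accumulates label embeddings across episodes, and a query embedding is assigned the label of the most similar stored cluster. All names here (`LabelMemory`, `adapt`) and the cosine-similarity scoring are illustrative assumptions, not the paper's actual implementation.

```python
import numpy as np

class LabelMemory:
    """Illustrative explicit memory of label representations from
    previously seen episodes (sketch, not the paper's implementation)."""

    def __init__(self):
        self.bank = {}  # label -> list of embedding vectors

    def store(self, label, embedding):
        # Learn-from-memory step: record this episode's label representation.
        self.bank.setdefault(label, []).append(np.asarray(embedding, dtype=float))

    def centroid(self, label):
        # One cluster per label, summarized by the mean of stored embeddings.
        return np.mean(self.bank[label], axis=0)

    def adapt(self, query):
        """Adaptation-from-memory step: return the label whose stored
        cluster centroid is most similar (cosine) to the query embedding."""
        query = np.asarray(query, dtype=float)

        def cos(a, b):
            return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-8))

        return max(self.bank, key=lambda lbl: cos(self.centroid(lbl), query))

memory = LabelMemory()
memory.store("B-artist", [1.0, 0.1])   # from an earlier episode
memory.store("B-artist", [0.9, 0.0])   # from a later episode
memory.store("O", [0.0, 1.0])
print(memory.adapt([0.95, 0.05]))      # closest to the "B-artist" cluster
```

In the full method, the stored representations would come from an encoder over few-shot episodes and the comparison would feed a contrastive loss; this sketch only shows the memory-lookup mechanics.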