CIFDM: Continual and Interactive Feature Distillation for Multi-Label Stream Learning

International ACM SIGIR Conference on Research and Development in Information Retrieval (2021)

Abstract
Multi-label learning algorithms have attracted increasing attention in recent years, mainly because real-world data are generally associated with multiple, non-exclusive labels that may correspond to different objects, scenes, actions, and attributes. In this paper, we consider the following challenging multi-label stream scenario: new labels emerge continuously in changing environments and are also assigned to previously observed data. In this setting, data mining solutions must learn the new concepts while simultaneously avoiding catastrophic forgetting. We propose a novel continual and interactive feature distillation-based learning framework (CIFDM) to effectively classify instances with novel labels. We utilize knowledge from previous tasks to learn the new knowledge needed for the current task; the system then compresses the historical and novel knowledge and preserves it while waiting for new emerging tasks. CIFDM consists of three components: 1) a knowledge bank that stores the existing feature-level compressed knowledge and predicts all labels observed so far; 2) a pioneer module that learns and predicts newly emerged labels based on the knowledge bank; 3) an interactive knowledge compression function that compresses and transfers the new knowledge into the bank, and then applies the current compressed knowledge to initialize the pioneer's label embeddings for the next task.
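The abstract only describes the three components at a high level; a minimal PyTorch sketch of that structure might look as follows. The class names mirror the abstract, but the layer shapes, the concatenation-based feature fusion, and the weight-copy compression step are illustrative assumptions, not the paper's actual method.

```python
# Minimal sketch of the three CIFDM components described above (assumed shapes and fusion).
import torch
import torch.nn as nn


class KnowledgeBank(nn.Module):
    """Stores feature-level compressed knowledge and predicts labels observed so far."""

    def __init__(self, in_dim: int, knowledge_dim: int, num_seen_labels: int):
        super().__init__()
        self.encoder = nn.Linear(in_dim, knowledge_dim)      # compressed feature knowledge
        self.classifier = nn.Linear(knowledge_dim, num_seen_labels)

    def forward(self, x: torch.Tensor):
        z = torch.relu(self.encoder(x))
        return z, torch.sigmoid(self.classifier(z))          # multi-label scores, seen labels


class Pioneer(nn.Module):
    """Learns and predicts newly emerged labels on top of the bank's knowledge."""

    def __init__(self, in_dim: int, knowledge_dim: int, embed_dim: int, num_new_labels: int):
        super().__init__()
        self.feature_net = nn.Linear(in_dim + knowledge_dim, embed_dim)
        # Label embeddings for the new task; the paper initializes these from the
        # compressed knowledge when the next task arrives.
        self.label_embed = nn.Parameter(torch.randn(num_new_labels, embed_dim) * 0.01)

    def forward(self, x: torch.Tensor, bank_knowledge: torch.Tensor):
        h = torch.relu(self.feature_net(torch.cat([x, bank_knowledge], dim=-1)))
        return torch.sigmoid(h @ self.label_embed.t())        # scores for the new labels


def compress_and_transfer(bank: KnowledgeBank, num_total_labels: int) -> KnowledgeBank:
    """Illustrative stand-in for the interactive compression step: grow the bank's
    classifier to cover the new labels while keeping its old weights, so the bank
    now predicts all labels observed so far."""
    old = bank.classifier
    new_cls = nn.Linear(old.in_features, num_total_labels)
    with torch.no_grad():
        new_cls.weight[: old.out_features].copy_(old.weight)
        new_cls.bias[: old.out_features].copy_(old.bias)
    bank.classifier = new_cls
    return bank


# Toy usage: one stream step with 10 previously seen labels and 3 newly emerged ones.
x = torch.randn(4, 32)
bank = KnowledgeBank(in_dim=32, knowledge_dim=16, num_seen_labels=10)
pioneer = Pioneer(in_dim=32, knowledge_dim=16, embed_dim=16, num_new_labels=3)
z, old_scores = bank(x)        # predictions for previously observed labels
new_scores = pioneer(x, z)     # predictions for the newly emerged labels
bank = compress_and_transfer(bank, num_total_labels=13)
```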
Keywords
Stream mining, Multi-label, Neural network, Incremental learning