A Novel Group-Aware Pruning Method For Few-Shot Learning

2019 INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS (IJCNN)

Abstract
Few-shot learning, which addresses machine learning problems in settings with scarce data, has recently become a hot topic in neural networks and machine learning. Inspired by BlockDrop, which compresses a network by pruning it with reinforcement learning, we present a Group-Aware Pruning (GAP) method with a new unified pruning strategy and modified training/testing procedures suited to few-shot learning. The proposed GAP consists of three modules: a pruning module, a strategy consensus module (SCM), and a classification module. The whole support set is first fed into the pruning module to obtain a pruning strategy for each image. Next, these strategies are fed into the SCM, where they are fused into a single group strategy for further pruning. When the classification module makes a prediction for a query image, it no longer needs to re-output a pruning strategy for that image, since the pruning strategies have already been unified. In the SCM, strategy unification is proposed to obtain the group-aware strategy, which ensures that an existing deep network pruned by the group-aware strategy works well on few-shot learning problems. In addition, testing is sped up because only one classification model is required in the proposed GAP. Experiments on two benchmark datasets, Omniglot and miniImageNet, and comparisons with existing state-of-the-art methods show that the accuracy of the proposed GAP is 4.94% higher than the state of the art on the 5-way 5-shot task, demonstrating the effectiveness of the proposed method.
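The abstract does not specify how the SCM fuses the per-image strategies into a group strategy. The sketch below is only one plausible reading, assuming (as in BlockDrop) that each strategy is a binary keep/prune mask over residual blocks and that unification can be approximated by a majority vote over the support set; the function name `fuse_group_strategy` and the voting threshold are illustrative, not from the paper.

```python
# Minimal sketch of the strategy-consensus idea, not the authors' exact algorithm.
# Assumption: each per-image pruning strategy is a binary mask over network blocks
# (1 = keep the block, 0 = prune it), and "strategy unification" is approximated
# by a majority vote across the support set.
import numpy as np

def fuse_group_strategy(per_image_strategies: np.ndarray, threshold: float = 0.5) -> np.ndarray:
    """Fuse per-image binary pruning strategies into one group-aware strategy.

    per_image_strategies: shape (num_support_images, num_blocks).
    Returns a single (num_blocks,) binary mask shared by the whole group.
    """
    keep_ratio = per_image_strategies.mean(axis=0)      # fraction of images keeping each block
    return (keep_ratio >= threshold).astype(np.int64)   # keep blocks favored by the majority

# Toy usage: a 5-shot support set voting over 6 candidate blocks.
support_strategies = np.array([
    [1, 1, 0, 1, 0, 1],
    [1, 0, 0, 1, 0, 1],
    [1, 1, 0, 1, 1, 1],
    [0, 1, 0, 1, 0, 1],
    [1, 1, 0, 0, 0, 1],
])
group_strategy = fuse_group_strategy(support_strategies)
print(group_strategy)  # [1 1 0 1 0 1]; query images reuse this single pruned network
```

Because every query image is classified with the same pruned network, only one classification model needs to be kept at test time, which is consistent with the speed-up claimed in the abstract.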
Keywords
Few-shot learning, deep learning, network pruning, classification, Group-Aware Pruning (GAP)