Learning to Slim Deep Networks with Bandit Channel Pruning

Qing Yang, Huihui Hu, Shijie Zhao, Hongliang Zhong

2021 4th International Conference on Advanced Electronic Materials, Computers and Software Engineering (AEMCSE), 2021

Abstract
In recent years, deep neural networks have achieved great success in machine vision, natural language processing, and reinforcement learning. However, deploying these models on embedded devices and large clusters faces the challenges of high energy consumption and low efficiency. In this paper, we propose an effective approach named Bandit Channel Pruning (BCP) to accelerate neural networks by channel-level pruning. Inspired by AutoML, we use a Multi-Armed Bandit (MAB) method to explore and exploit the impact of each channel on model performance. Specifically, we use the loss value of the model's output as a penalty term to find the set of redundant channels. In addition, we prove that the change in this loss value can be used as a criterion of channel redundancy. We analyze the complexity of BCP and give an upper bound on the number of search steps. Our approach is validated with several deep neural networks, including VGGNet, ResNet56, and ResNet110, on different image classification datasets. Extensive experiments on these models and datasets demonstrate that the performance of this method is better than state-of-the-art channel pruning methods.
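The abstract frames channel selection as a multi-armed bandit problem in which pulling an arm (masking a channel) yields a noisy observation of the loss change. The following toy sketch illustrates that framing with a lower-confidence-bound bandit over hypothetical channel importances; it is an illustrative assumption of how such a search might look, not the authors' BCP algorithm (the `importance` values, noise model, and pruning budget are all invented for the example).

```python
import math
import random

# Hypothetical toy setup: each "channel" has an assumed true importance,
# and masking a channel causes a noisy loss increase proportional to it.
random.seed(0)
importance = [0.9, 0.05, 0.7, 0.02, 0.4]  # invented ground truth

def loss_increase(ch):
    """Noisy observation of the loss change when channel `ch` is masked."""
    return importance[ch] + random.gauss(0.0, 0.01)

n = len(importance)
counts = [0] * n       # how often each channel was probed
means = [0.0] * n      # running mean of observed loss increases

# Pull each arm once, then follow a lower-confidence-bound rule:
# redundant channels are those whose pruning causes the SMALLEST loss
# increase, so we minimize (mean - exploration bonus).
for t in range(1, 501):
    if t <= n:
        ch = t - 1
    else:
        ch = min(range(n),
                 key=lambda c: means[c] - math.sqrt(2 * math.log(t) / counts[c]))
    r = loss_increase(ch)
    counts[ch] += 1
    means[ch] += (r - means[ch]) / counts[ch]  # incremental mean update

# Prune the two channels estimated to be least important.
redundant = sorted(range(n), key=lambda c: means[c])[:2]
print(sorted(redundant))  # channels 1 and 3 carry the lowest importance
```

In this sketch the exploration bonus shrinks as a channel is probed more often, so the search concentrates its pulls on the channels that look redundant, which is the explore/exploit trade-off the abstract attributes to the MAB formulation.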
Keywords
model compression,network acceleration,multi-armed bandit,deep neural network