Gate Trimming: One-Shot Channel Pruning for Efficient Convolutional Neural Networks

2021 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP 2021)

Abstract
Channel pruning is a promising technique for model compression and acceleration because it reduces the space and time complexity of convolutional neural networks (CNNs) while maintaining their performance. Existing methods perform channel pruning through iterative optimization or training with sparsity-inducing regularization, both of which limit practical utility because of their inefficiency. In this work, we propose a one-shot global pruning approach called Gate Trimming (GT) that compresses CNNs more efficiently. GT performs the pruning operation only once, avoiding expensive retraining or repeated re-evaluation of channel redundancy. In addition, GT globally estimates the effect of channels across all layers using information gain (IG). Based on the IG of channels, GT accurately prunes redundant channels with little negative effect on the CNN. Experimental results show that the proposed GT is superior to state-of-the-art methods.
Keywords
Convolutional Neural Networks, Model Compression and Acceleration, Channel Pruning
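
To make the abstract's pipeline concrete, below is a minimal, hypothetical PyTorch sketch of one-shot global channel pruning. It is not the authors' Gate Trimming implementation: the paper's information-gain (IG) criterion is approximated here by the entropy of each channel's activation histogram, and the function names (`channel_scores`, `one_shot_global_prune`) and the pruning ratio are illustrative assumptions.

```python
# A minimal sketch of one-shot global channel pruning (PyTorch).
# NOTE: this is NOT the authors' Gate Trimming code. The paper's
# information-gain (IG) criterion is approximated by the entropy of
# each channel's activation histogram; all names are illustrative.
import torch
import torch.nn as nn


def channel_scores(activations: torch.Tensor, bins: int = 32) -> torch.Tensor:
    """Score each output channel by the entropy of its activation
    histogram (a stand-in for the paper's IG). `activations`: (N, C, H, W)."""
    c = activations.shape[1]
    flat = activations.transpose(0, 1).reshape(c, -1)   # (C, N*H*W)
    scores = torch.empty(c, device=flat.device)
    for i in range(c):
        hist = torch.histc(flat[i].float(), bins=bins)
        p = hist / hist.sum()
        p = p[p > 0]                                    # drop empty bins
        scores[i] = -(p * p.log()).sum()                # Shannon entropy
    return scores


@torch.no_grad()
def one_shot_global_prune(model: nn.Module, batch: torch.Tensor,
                          ratio: float = 0.3) -> None:
    """Collect per-channel scores from every Conv2d output in a single
    forward pass, rank them globally across all layers, and zero out the
    lowest-scoring fraction. Pruning happens once: no iterative
    retraining or re-evaluation loop."""
    scores, layers, hooks = [], [], []

    def make_hook(layer):
        def hook(_module, _inputs, output):
            scores.append(channel_scores(output))
            layers.append(layer)
        return hook

    for m in model.modules():
        if isinstance(m, nn.Conv2d):
            hooks.append(m.register_forward_hook(make_hook(m)))
    model(batch)                          # one evaluation pass
    for h in hooks:
        h.remove()

    all_scores = torch.cat(scores)        # global ranking across layers
    k = max(1, int(ratio * all_scores.numel()))
    threshold = all_scores.kthvalue(k).values
    for layer, s in zip(layers, scores):
        mask = (s > threshold).to(layer.weight.dtype)
        layer.weight.mul_(mask.view(-1, 1, 1, 1))   # zero pruned filters
        if layer.bias is not None:
            layer.bias.mul_(mask)
```

A typical call would be `one_shot_global_prune(model, calibration_batch, ratio=0.3)` with a single batch of representative inputs; the zeroed filters mark the channels that a full implementation would then remove structurally to realize the space and time savings.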