Automatic Network Pruning via Hilbert-Schmidt Independence Criterion Lasso under Information Bottleneck Principle.

Proceedings of the IEEE/CVF International Conference on Computer Vision (ICCV), 2023

Abstract
Most existing neural network pruning methods rely on hand-crafted importance criteria and hand-picked structures to prune, which creates heavy and unintended dependence on heuristics and expert experience for both the objective and the parameters of the pruning approach. In this paper, we address this problem by introducing a principled and unified framework based on Information Bottleneck (IB) theory, which further guides us to an automatic pruning approach. Specifically, we first formulate the channel pruning problem from an IB perspective, and then show that, under certain conditions, the IB principle can be implemented by solving a Hilbert-Schmidt Independence Criterion (HSIC) Lasso problem. Guided by this theory, we then provide an automatic pruning scheme that searches for global penalty coefficients. Extensive experiments verify that our method yields state-of-the-art performance on various benchmark networks and datasets. For example, with VGG-16, we achieve a 60% FLOPs reduction by removing 76% of the parameters, with an improvement of 0.40% in top-1 accuracy on CIFAR-10. With ResNet-50, we achieve a 56% FLOPs reduction by removing 50% of the parameters, with a small loss of 0.08% in top-1 accuracy on ImageNet. The code is available at https://github.com/sunggo/APIB.
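To make the HSIC-Lasso step concrete, the sketch below scores channels by how well each channel's centered Gram matrix reconstructs the output's Gram matrix under a non-negative Lasso; channels whose coefficients shrink to zero are candidates for pruning. This is a rough illustration under the standard HSIC Lasso formulation, not the authors' released implementation: the kernel choices, the fixed penalty `lam` (the paper searches penalty coefficients automatically), and the helper names `centered_gram` / `hsic_lasso_channel_scores` are assumptions.

```python
import numpy as np
from sklearn.linear_model import Lasso  # non-negative Lasso via positive=True

def centered_gram(X, sigma=None):
    """Centered Gaussian-kernel Gram matrix HKH for samples X of shape (n, d)."""
    sq = np.sum(X ** 2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * X @ X.T  # pairwise squared distances
    if sigma is None:
        sigma = np.sqrt(0.5 * np.median(d2[d2 > 0]))  # median heuristic bandwidth
    K = np.exp(-d2 / (2.0 * sigma ** 2))
    n = K.shape[0]
    H = np.eye(n) - np.ones((n, n)) / n  # centering matrix
    return H @ K @ H

def hsic_lasso_channel_scores(channel_acts, labels, lam=1e-3):
    """
    channel_acts: list of (n, feat_dim) arrays, one per channel (n kept small,
                  since the design matrix has n*n rows per channel).
    labels:       (n,) targets, here kernelized like any other variable.
    Returns a non-negative importance score per channel; zeros are prunable.
    """
    n = len(labels)
    Lbar = centered_gram(np.asarray(labels, dtype=float).reshape(n, -1))
    # Each channel contributes its vectorized centered Gram matrix as one column.
    Phi = np.stack([centered_gram(a.reshape(n, -1)) .ravel()
                    for a in channel_acts], axis=1)
    y = Lbar.ravel()
    # min_a 0.5/m * ||y - Phi a||^2 + lam * ||a||_1  s.t.  a >= 0
    # (sklearn's Lasso scales the data term by the number of rows m = n*n).
    model = Lasso(alpha=lam, positive=True, fit_intercept=False, max_iter=10000)
    model.fit(Phi, y)
    return model.coef_
```

In this view, raising `lam` drives more coefficients to exactly zero, so the sparsity pattern directly determines which channels to remove; the paper's automatic scheme corresponds to searching such penalty coefficients globally rather than fixing them per layer by hand.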