CRISP: Hybrid Structured Sparsity for Class-aware Model Pruning
CoRR (2023)
Abstract
Machine learning pipelines for classification tasks often train a universal
model to achieve accuracy across a broad range of classes. However, a typical
user encounters only a limited selection of classes regularly. This disparity
provides an opportunity to enhance computational efficiency by tailoring models
to focus on user-specific classes. Existing works rely on unstructured pruning,
which introduces randomly distributed non-zero values in the model, making it
unsuitable for hardware acceleration. Alternatively, some approaches employ
structured pruning, such as channel pruning, but these tend to provide only
minimal compression and may lead to reduced model accuracy. In this work, we
propose CRISP, a novel pruning framework leveraging a hybrid structured
sparsity pattern that combines both fine-grained N:M structured sparsity and
coarse-grained block sparsity. Our pruning strategy is guided by a
gradient-based class-aware saliency score, allowing us to retain weights
crucial for user-specific classes. CRISP achieves high accuracy with minimal
memory consumption for popular models like ResNet-50, VGG-16, and MobileNetV2
on ImageNet and CIFAR-100 datasets. Moreover, CRISP delivers up to 14$\times$
reduction in latency and energy consumption compared to existing pruning
methods while maintaining comparable accuracy. Our code is available at
https://github.com/shivmgg/CRISP/.
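To make the fine-grained half of the hybrid pattern concrete, here is a minimal NumPy sketch of N:M structured sparsity (e.g. the common 2:4 case): within every group of M consecutive weights, only the N largest-magnitude entries are kept. This is an illustrative implementation of the general pattern, not the authors' code; the function name `nm_prune` and the magnitude-based selection are assumptions for the example.

```python
import numpy as np

def nm_prune(weights, n=2, m=4):
    """Illustrative N:M pruning: in each group of m consecutive weights,
    zero out all but the n largest-magnitude entries."""
    w = weights.reshape(-1, m).copy()
    # Indices of the (m - n) smallest-magnitude entries in each group.
    drop = np.argsort(np.abs(w), axis=1)[:, : m - n]
    np.put_along_axis(w, drop, 0.0, axis=1)
    return w.reshape(weights.shape)

# Example: 2:4 sparsity keeps the two largest weights per group of four.
w = np.arange(1, 9, dtype=float).reshape(2, 4)  # [[1,2,3,4],[5,6,7,8]]
pruned = nm_prune(w)  # [[0,0,3,4],[0,0,7,8]]
```

Because the nonzeros follow a fixed per-group budget, this pattern maps onto hardware support for structured sparsity (such as 2:4 sparse tensor cores), which is the efficiency advantage the abstract contrasts against unstructured pruning.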