Pruning Convolutional Filters Using Batch Bridgeout

IEEE Access (2020)

Abstract
State-of-the-art computer vision models are rapidly increasing in capacity, with parameter counts far exceeding what is required to fit the training set. This overparameterization improves optimization and generalization performance. However, the sheer size of contemporary models leads to high inference costs and limits their use on resource-constrained devices. To reduce inference costs, convolutional filters in trained neural networks can be pruned, lowering run-time memory and computational requirements during inference. However, severe post-training pruning degrades performance when the training algorithm produces dense weight vectors. We propose Batch Bridgeout, a sparsity-inducing stochastic regularization scheme, to train neural networks so that they can be pruned efficiently with minimal degradation in performance. We evaluate the proposed method on common computer vision models (VGGNet, ResNet, and Wide ResNet) on the CIFAR-10 and CIFAR-100 image classification tasks. For all networks, experimental results show that Batch Bridgeout-trained networks achieve higher accuracy across a wide range of pruning intensities than networks trained with Dropout or weight decay regularization.
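To make the train-then-prune pipeline described above concrete, the sketch below shows a PyTorch-style convolutional layer with a Bridgeout-style stochastic weight perturbation, followed by magnitude-based filter pruning. This is a minimal illustration, not the paper's implementation: it assumes the perturbation form w̃ = w + |w|^(q/2) · (r/p − 1) with r ~ Bernoulli(p) from the original Bridgeout formulation, assumes "Batch" means one noise mask shared across each mini-batch, and uses L1 filter norms as the pruning criterion; the names BridgeoutConv2d and prune_filters_by_l1 and the hyperparameter defaults are hypothetical.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class BridgeoutConv2d(nn.Conv2d):
    """Conv2d with a Bridgeout-style stochastic weight perturbation (sketch).

    During training, each weight w is perturbed as
        w_tilde = w + |w|^(q/2) * (r / p - 1),   r ~ Bernoulli(p),
    which is zero-mean and, in expectation, penalizes the L_q (bridge)
    norm of the weights, pushing them toward sparsity. One mask is drawn
    per forward pass, i.e. shared across the mini-batch (an assumption
    about what "Batch" denotes here).
    """
    def __init__(self, *args, p=0.5, q=1.0, **kwargs):
        super().__init__(*args, **kwargs)
        self.p = p  # Bernoulli keep probability
        self.q = q  # bridge-norm exponent; q < 2 induces sparsity

    def forward(self, x):
        w = self.weight
        if self.training:
            mask = torch.bernoulli(torch.full_like(w, self.p))
            w = w + w.abs().pow(self.q / 2.0) * (mask / self.p - 1.0)
        return F.conv2d(x, w, self.bias, self.stride,
                        self.padding, self.dilation, self.groups)

def prune_filters_by_l1(conv, fraction):
    """Zero out the `fraction` of output filters with the smallest L1 norm."""
    with torch.no_grad():
        # One norm per output filter: weight shape is (out, in, kH, kW).
        norms = conv.weight.abs().sum(dim=(1, 2, 3))
        k = int(fraction * norms.numel())
        if k > 0:
            idx = norms.argsort()[:k]
            conv.weight[idx] = 0.0
            if conv.bias is not None:
                conv.bias[idx] = 0.0
```

A layer like this could be swapped in for nn.Conv2d in a VGG- or ResNet-style network, trained as usual, and then pruned with prune_filters_by_l1 at increasing fractions to trace an accuracy-versus-pruning-intensity curve of the kind the paper reports.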
Keywords
Training, Neurons, Computational modeling, Task analysis, Stochastic processes, Sparse matrices, Cost function, Bridge regularization, deep neural networks, dropout, pruning, image classification, neural network regularization, neural network training