A Multi-objective Particle Swarm Optimization for Neural Networks Pruning

2019 IEEE Congress on Evolutionary Computation (CEC), 2019

Abstract
There is a ruling maxim in deep learning: bigger is better. However, while a bigger neural network offers higher performance, it also incurs greater computation, memory, and energy costs. Simplified models that preserve the accuracy of the original network have therefore attracted growing interest. A simple yet effective approach is pruning, which removes unimportant synapses and neurons; the crucial step is to identify the important parts among the numerous connections. In this paper, we use an evolutionary pruning method to simplify the structure of deep neural networks. We propose a multi-objective pruning model that balances the accuracy and the sparse ratio of the network, and solve it with a particle swarm optimization (PSO) method. Furthermore, we fine-tune the pruned network to obtain a better pruning result, and a framework of alternating pruning and fine-tuning operations is used to achieve a more prominent pruning effect. In experimental studies, we prune LeNet on MNIST and a shallow VGGNet on CIFAR-10. Experimental results demonstrate that our method can generally prune over 80% of the weights with no loss of accuracy.
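The following is a minimal sketch, not the authors' implementation, of the idea the abstract describes: a binary-mask PSO that treats classification error and network density (one minus the sparse ratio) as two objectives and keeps a Pareto archive of non-dominated masks. The network evaluation on LeNet/VGGNet is replaced here by a hypothetical stand-in `evaluate_accuracy`; the swarm size, iteration count, and PSO coefficients are illustrative assumptions.

```python
# Sketch of multi-objective binary PSO over pruning masks (assumptions noted inline).
import numpy as np

rng = np.random.default_rng(0)

N_WEIGHTS = 256        # number of prunable weights (illustrative, not from the paper)
N_PARTICLES = 20
N_ITERS = 50


def evaluate_accuracy(mask: np.ndarray) -> float:
    """Hypothetical proxy: accuracy drops as more 'important' weights are cut.
    In the paper's setting this would evaluate the masked network on validation data."""
    importance = np.linspace(1.0, 0.0, mask.size)
    return float(np.clip(1.0 - (importance * (1 - mask)).sum() / mask.size, 0.0, 1.0))


def objectives(mask: np.ndarray) -> tuple[float, float]:
    """Two objectives to minimise: error and density (1 - sparse ratio)."""
    return 1.0 - evaluate_accuracy(mask), float(mask.mean())


def dominates(a, b):
    """Pareto dominance: a is no worse in every objective and better in at least one."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))


# Binary PSO: velocities are real-valued, masks are sampled through a sigmoid.
vel = rng.normal(0.0, 1.0, (N_PARTICLES, N_WEIGHTS))
pos = (rng.random((N_PARTICLES, N_WEIGHTS)) > 0.5).astype(float)
pbest = pos.copy()
pbest_obj = [objectives(p) for p in pos]
archive = []                                   # external archive of non-dominated masks

for _ in range(N_ITERS):
    leaders = archive if archive else list(zip(pbest, pbest_obj))
    gbest = min(leaders, key=lambda m: sum(m[1]))[0]   # simple scalarised leader choice

    for i in range(N_PARTICLES):
        r1, r2 = rng.random(N_WEIGHTS), rng.random(N_WEIGHTS)
        vel[i] = 0.7 * vel[i] + 1.5 * r1 * (pbest[i] - pos[i]) + 1.5 * r2 * (gbest - pos[i])
        pos[i] = (rng.random(N_WEIGHTS) < 1.0 / (1.0 + np.exp(-vel[i]))).astype(float)

        obj = objectives(pos[i])
        if dominates(obj, pbest_obj[i]):
            pbest[i], pbest_obj[i] = pos[i].copy(), obj

        # maintain the archive: drop masks the new one dominates, add it if undominated
        if not any(dominates(o, obj) for _, o in archive):
            archive = [(m, o) for m, o in archive if not dominates(obj, o)]
            archive.append((pos[i].copy(), obj))

best_mask, best_obj = min(archive, key=lambda m: m[1][0])
print(f"error={best_obj[0]:.3f}, kept weights={best_mask.mean():.1%}")
```

In the paper's alternating framework, a mask selected from such an archive would be applied to the network, the remaining weights fine-tuned, and the pruning search repeated; the toy proxy above only illustrates the two-objective PSO search itself.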
Keywords
multiobjective particle swarm optimization,evolutionary pruning method,deep neural networks,deep learning,multiobjective neural network pruning model