Pruning of Dendritic Neuron Model with Significance Constraints for Classification

IJCNN (2023)

Abstract
The dendritic neuron model (DNM) simulates the information-processing mechanisms and procedures of neurons by mimicking the nonlinearity of synapses in the human brain. This allows for a better understanding of biological nervous systems, and the model has been applied to good effect in various fields. However, the existing DNM suffers from high complexity and limited generalization capability. Pruning is one of the most common and crucial approaches to model compression: a model-optimization technique that removes redundant values from a weight tensor to yield smaller, more efficient neural networks. A compressed network runs faster and reduces the computational cost of training. To improve model performance, this work proposes a DNM pruning method with dendrite-layer significance constraints. The proposed method not only calculates the significance of each dendrite layer but also concentrates the significance of the trained model in a few dendrite layers, so that dendrite layers with low significance can be pruned away. Simulation experiments on classification problems show that the proposed method outperforms existing pruning methods in terms of network size and generalization performance.
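The abstract does not specify the exact significance measure or the form of the constraint, so the following is only a minimal sketch of significance-based dendrite pruning. It assumes the standard DNM formulation (synaptic sigmoid layer, multiplicative dendrite layer, additive membrane layer, sigmoid soma) and uses a hypothetical mean-absolute-activation score with a hypothetical relative threshold; the dimensions D, M, N, the steepness k, and the 0.1 cutoff are all illustrative, not the paper's values.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy dimensions (hypothetical): D input features, M dendrite branches, N samples.
D, M, N = 4, 8, 64
k = 5.0                       # synaptic steepness constant (assumed)

# Trainable DNM parameters: per-synapse weights w and thresholds q.
w = rng.normal(size=(M, D))
q = rng.normal(size=(M, D))

def forward(X):
    """Standard DNM forward pass: synapse -> dendrite -> membrane -> soma."""
    # Synaptic layer: one sigmoid per (branch, input) pair.
    Y = 1.0 / (1.0 + np.exp(-k * (X[:, None, :] * w - q)))  # shape (N, M, D)
    Z = Y.prod(axis=2)        # dendrite layer: product over inputs, (N, M)
    V = Z.sum(axis=1)         # membrane layer: sum over branches, (N,)
    soma = 1.0 / (1.0 + np.exp(-k * (V - 0.5)))
    return Z, soma

X = rng.uniform(size=(N, D))
Z, _ = forward(X)

# Hypothetical significance score: mean absolute dendrite activation over a
# batch. In the paper, a training-time constraint concentrates significance
# in a few branches; here we only mimic the post-training pruning step.
significance = np.abs(Z).mean(axis=0)                # one score per branch, (M,)
keep = significance >= 0.1 * significance.max()      # assumed relative threshold

w_pruned, q_pruned = w[keep], q[keep]                # drop low-significance branches
print(f"kept {keep.sum()} of {M} dendrite branches")
```

After pruning, the smaller parameter tensors `w_pruned` and `q_pruned` define a compressed DNM that can be fine-tuned or evaluated directly; under a concentration constraint, most branches fall below the threshold and the network shrinks substantially.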
Keywords
Model compression, deep neural networks, dendritic neuron model, network pruning