Exploiting similarity-induced redundancies in correlation topology for channel pruning in deep convolutional neural networks

Jian Liu, Haijian Shao, Dan Xing, Yingtao Jiang

International Journal of Computers and Applications (2023)

Abstract
The paper addresses the high computational cost of convolutional neural networks (CNNs) in real-world applications, a cost that stems primarily from their complex hidden-layer structure. To mitigate it, the paper proposes a novel channel pruning technique that leverages the correlation topology of the feature maps generated by each CNN layer to construct a network with fewer nodes, significantly reducing computational cost. Redundant channels exhibit a high degree of topological similarity, and their number tends to grow with network depth. Removing the channels whose feature maps are highly correlated with others recovers the 'base' set of features needed by subsequent layers. The proposed technique thus offers a promising approach to reducing the computational cost of deep CNNs while maintaining high performance. By producing a network structure tailored to a specific type of input data, the method yields more efficient and effective models. The pruning operation requires fine-tuning to restore network performance, and experiments on X-ray, chest CT, and MNIST images show that the pruned network can eliminate approximately 80% of redundant channels with minimal performance degradation, retaining 99.2% of the original CNN's accuracy.
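The abstract does not give the exact redundancy criterion, so the following is a minimal sketch of the general idea, assuming PyTorch: channels whose feature maps produce near-duplicate (highly correlated) responses on a calibration batch are flagged for removal. The function name find_redundant_channels, the Pearson-correlation measure, the greedy keep/prune loop, and the 0.9 threshold are illustrative assumptions, not the authors' formulation.

```python
import torch

def find_redundant_channels(feature_maps: torch.Tensor, threshold: float = 0.9):
    """Flag channels whose feature maps are highly correlated.

    feature_maps: activations of one conv layer on a calibration batch,
    shape (N, C, H, W). Returns (keep, prune) lists of channel indices.
    """
    n, c, h, w = feature_maps.shape
    # Flatten each channel's responses across the batch and spatial dims.
    flat = feature_maps.permute(1, 0, 2, 3).reshape(c, -1)
    # Normalize so that the inner product is the Pearson correlation.
    flat = flat - flat.mean(dim=1, keepdim=True)
    flat = flat / (flat.norm(dim=1, keepdim=True) + 1e-8)
    corr = flat @ flat.t()  # (C, C), entries in [-1, 1]

    keep, prune = [], []
    for i in range(c):
        # Greedily keep a channel only if it is not highly correlated
        # with any channel already kept; otherwise mark it redundant.
        if any(corr[i, j].abs() >= threshold for j in keep):
            prune.append(i)
        else:
            keep.append(i)
    return keep, prune
```

In practice one would collect the activations with forward hooks on a calibration batch, rebuild each conv layer without the flagged filters, and then fine-tune the pruned network, as the abstract notes.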
Keywords
correlation topology, deep convolutional neural networks, redundancies, channel, similarity-induced