Hybrid Domain Convolutional Neural Network for Memory Efficient Training.

CICAI (2021)

Abstract
For many popular Convolutional Neural Networks (CNNs), memory has become one of the major constraints on efficient training and inference on edge devices. Recent work has shown that the bottleneck lies in the feature maps generated by convolutional layers. In this work, we propose a hybrid domain Convolutional Neural Network (HyNet) to reduce the memory footprint. Specifically, HyNet prunes the filters in the spatial domain and sparsifies the feature maps in the frequency domain. HyNet also introduces a specifically designed activation function in the frequency domain that preserves the sparsity of the feature maps while effectively strengthening training convergence. We evaluate the performance of HyNet on three state-of-the-art networks (VGG, DenseNet, and ResNet) across competitive image classification benchmarks (CIFAR-10 and ImageNet), and compare it with several memory-efficient training approaches. Overall, HyNet reduces memory consumption by ∼50% without significant accuracy loss.
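To make the core idea concrete, the sketch below illustrates frequency-domain sparsification of a feature map in the general sense the abstract describes: transform the map with an FFT, keep only the largest-magnitude coefficients, and zero out the rest. This is a hypothetical illustration under assumed details (2-D FFT, magnitude-based top-k selection, `keep_ratio` parameter), not the paper's exact method.

```python
import numpy as np

def sparsify_frequency(feature_map, keep_ratio=0.5):
    """Illustrative frequency-domain sparsification (assumed scheme,
    not the exact HyNet procedure).

    Transforms a 2-D feature map to the frequency domain, keeps only
    the largest-magnitude coefficients, and zeroes the remainder, so
    the sparse representation needs less memory to store.
    """
    freq = np.fft.fft2(feature_map)
    mags = np.abs(freq).ravel()
    k = max(1, int(keep_ratio * mags.size))
    # Magnitude of the k-th largest coefficient serves as the threshold.
    threshold = np.partition(mags, -k)[-k]
    # Zero every coefficient whose magnitude falls below the threshold.
    return np.where(np.abs(freq) >= threshold, freq, 0)

rng = np.random.default_rng(0)
fm = rng.standard_normal((8, 8))
sparse = sparsify_frequency(fm, keep_ratio=0.5)
```

Roughly half of the 64 frequency coefficients survive (ties from the conjugate symmetry of a real input's spectrum can keep one or two extra), and the zeroed coefficients are what a compressed storage format would skip.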
Keywords
memory, neural network, domain, training