Training Acceleration of Frequency Domain CNNs Using Activation Compression

ISCAS (2023)

Abstract
Reducing the complexity of training convolutional neural networks lowers the energy consumed during training, or raises accuracy by admitting more training epochs within a fixed training time budget. During backpropagation, a considerable amount of temporary data is offloaded from GPU memory to CPU memory, increasing training time. In this paper, we address this overhead by introducing an activation compression technique for frequency-domain convolutional neural networks. Applied to a frequency-domain AlexNet, the technique compresses activations by 57.7% and reduces training time by 23%, with a negligible effect on classification accuracy.
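The abstract describes the idea at a high level only. Purely as an illustration of what compressing frequency-domain activations before GPU-to-CPU offloading might look like, here is a minimal PyTorch sketch. The function names `compress_activation` and `decompress_activation`, the top-k magnitude pruning rule, and the `keep_ratio` parameter are all assumptions made for this sketch, not the authors' actual scheme.

```python
# Illustrative sketch only -- not the paper's implementation.
# Assumed scheme: keep only the largest-magnitude frequency coefficients
# of each activation before offloading it to CPU memory, and rebuild the
# dense tensor when the backward pass needs it.
import math
import torch

def compress_activation(act_freq: torch.Tensor, keep_ratio: float = 0.4):
    """Sparsify a frequency-domain activation and offload it to CPU memory.

    act_freq   : complex tensor, e.g. the output of torch.fft.fft2
    keep_ratio : fraction of coefficients retained (assumed parameter)
    Returns (values, indices, shape): a compact representation that is
    cheaper to transfer from GPU to CPU than the dense tensor.
    """
    flat = act_freq.flatten()
    k = max(1, int(keep_ratio * flat.numel()))
    idx = torch.topk(flat.abs(), k).indices        # largest-magnitude bins
    return flat[idx].cpu(), idx.cpu(), act_freq.shape

def decompress_activation(values, indices, shape, device):
    """Rebuild the dense frequency-domain activation for backpropagation."""
    flat = torch.zeros(math.prod(shape), dtype=values.dtype, device=device)
    flat[indices.to(device)] = values.to(device)
    return flat.reshape(shape)

# Toy round trip (on CPU so the sketch runs anywhere):
x = torch.randn(8, 64, 32, 32)                     # a batch of feature maps
xf = torch.fft.fft2(x)                             # frequency-domain activation
packed = compress_activation(xf)
xf_rec = decompress_activation(*packed, device="cpu")
```

One common rationale for a magnitude-based rule is that frequency-domain activations tend to concentrate most of their energy in a small number of coefficients, so discarding the rest can shrink the offloaded data substantially while perturbing gradients only slightly; whether the paper uses this particular rule is not stated in the abstract.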
Keywords
activation compression technique,backpropagation,convolutional neural network,CPU memory,energy consumption,frequency domain AlexNet,frequency domain CNN training acceleration,frequency domain convolutional neural networks,GPU memory,training time budget,training time overhead