Novel Knowledge Distillation to Improve Training Accuracy of Spin-based SNN

2023 IEEE 5th International Conference on Artificial Intelligence Circuits and Systems (AICAS), 2023

Abstract
Spintronics-based magnetic tunnel junction (MTJ) devices have been shown to work as both synapses and spiking threshold neurons, making them well suited to hardware implementations of spiking neural networks (SNNs). They offer the inherent advantage of high energy efficiency at ultra-low operating voltages, owing to their nanometric size and low depinning current densities. However, hardware-based SNN training typically suffers a significant performance loss compared with the original neural network, due to device-to-device variations and the information deficiency that arises when weights are mapped to device synaptic conductances. Knowledge distillation is a model compression and acceleration method that transfers the knowledge learned by a large machine learning model to a smaller model with minimal loss in performance. In this paper, we propose a novel training scheme based on spike knowledge distillation that improves the training performance of a spin-based SNN (SSNN) model by transferring knowledge from a large CNN model. We introduce novel distillation methodologies and demonstrate their effectiveness with detailed experiments on four datasets. The experimental results indicate that our proposed training scheme consistently improves the performance of the SSNN model by a large margin.
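For context, the sketch below illustrates a generic knowledge-distillation objective of the kind referenced in the abstract; it is an assumption-laden illustration, not the paper's specific spike-distillation formulation. The student SSNN's outputs (e.g., spike rates read out as class logits) are trained against both the ground-truth labels and the temperature-softened predictions of a pretrained CNN teacher; the function name, temperature, and weighting factor are hypothetical.

```python
# Minimal sketch of a generic knowledge-distillation loss (illustrative only;
# the paper's spike-distillation methodology is not reproduced here).
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels,
                      temperature=4.0, alpha=0.5):
    """Combine hard-label cross-entropy with soft-label KL divergence."""
    # Hard loss: standard cross-entropy against the ground-truth labels.
    hard = F.cross_entropy(student_logits, labels)
    # Soft loss: KL divergence between temperature-softened student and
    # teacher distributions, scaled by T^2 to keep gradient magnitudes stable.
    soft = F.kl_div(
        F.log_softmax(student_logits / temperature, dim=1),
        F.softmax(teacher_logits / temperature, dim=1),
        reduction="batchmean",
    ) * (temperature ** 2)
    return alpha * hard + (1.0 - alpha) * soft
```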
Keywords
SNN, magnetic tunnel junction, knowledge distillation, transfer learning