GAND-Nets: Training Deep Spiking Neural Networks with Ternary Weights

2022 IEEE 35th International System-on-Chip Conference (SOCC), 2022

Abstract
Spiking neural networks (SNNs) have emerged as a promising alternative to artificial neural networks (ANNs) because event-driven sparsity makes them energy-efficient on resource-constrained devices. However, state-of-the-art SNNs rely on long time steps and full-precision weights to implement complex image classification applications, which incurs significant latency and computational cost in hardware implementations. To this end, this paper combines a surrogate-gradient-based SNN model with a threshold-based ternary weight paradigm to exploit the efficiency of both binary spike inputs {0, 1} and discrete ternary weights {−1, 0, 1}. In this manner, the internal state update of the SNN can be accelerated by a binary-ternary dot product that replaces multiply-and-accumulate operations. Moreover, binary-ternary dot products can be implemented as gated AND networks (GAND-Nets): only event-driven non-zero activations enable the control gate to trigger the AND logic operations, leading toward energy-efficient edge intelligence. As a proof of concept, we evaluate the proposed GAND-Nets on the CIFAR-10, CIFAR-100, and NMNIST datasets with stochastic rate encoding, achieving 87.42%, 63.42%, and 98.43% accuracy, respectively, with fewer time steps, while providing 1-bit-width binary-ternary dot product acceleration.
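The binary-ternary dot product described above can be illustrated with a minimal sketch. This is not the authors' implementation; the function name and inputs are illustrative. The point is that with spikes in {0, 1} and weights in {−1, 0, 1}, the dot product reduces to gated additions and subtractions, with no multiplications, and only non-zero (event-driven) spikes trigger any work:

```python
def binary_ternary_dot(spikes, weights):
    """Dot product of binary spikes {0, 1} with ternary weights {-1, 0, 1}.

    Multiplication-free: each non-zero spike acts as a control gate that
    enables an add (w == +1) or subtract (w == -1), mirroring the
    event-driven gated-AND idea described in the abstract.
    """
    acc = 0
    for s, w in zip(spikes, weights):
        if s:  # event-driven gate: zero spikes are skipped entirely
            if w == 1:
                acc += 1
            elif w == -1:
                acc -= 1
    return acc

# Example: [1, 0, 1, 1] . [1, -1, 0, 1] = 1 + 0 + 0 + 1 = 2
print(binary_ternary_dot([1, 0, 1, 1], [1, -1, 0, 1]))  # prints 2
```

In hardware, each gated branch corresponds to an AND gate enabled by the spike bit, which is where the energy savings of the event-driven formulation come from.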
Keywords
GAND-Nets, spiking neural network, ternary weights, SNN compression