Spike Attention Coding for Spiking Neural Networks.

IEEE Transactions on Neural Networks and Learning Systems (2023)

Abstract
Spiking neural networks (SNNs), an important family of neuroscience-oriented intelligent models, play an essential role in the neuromorphic computing community. Spike rate coding and temporal coding are the mainstream coding schemes in the current modeling of SNNs. However, rate coding usually suffers from limited representation resolution and long latency, while temporal coding usually suffers from under-utilization of spike activities. To this end, we propose spike attention coding (SAC) for SNNs. By introducing learnable attention coefficients for each time step, our coding scheme can naturally unify rate coding and temporal coding, and then flexibly learn optimal coefficients for better performance. Several normalization and regularization techniques are further incorporated to control the range and distribution of the learned attention coefficients. Extensive experiments on classification, generation, and regression tasks are conducted and demonstrate the superiority of the proposed coding scheme. This work provides a flexible coding scheme to enhance the representation power of SNNs and extends their application scope beyond the mainstream classification scenario.
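The abstract states that SAC attaches a learnable attention coefficient to each time step, with rate coding and temporal coding recoverable as special cases of the coefficient pattern. A minimal sketch of that decoding idea (the function name `sac_decode` and the specific coefficient patterns are illustrative assumptions, not the paper's implementation; in SAC the coefficients would be learned jointly with the network):

```python
import numpy as np

def sac_decode(spikes, alpha):
    """Decode a spike train with per-time-step attention coefficients.

    spikes: (T, N) array of binary spike activity over T time steps.
    alpha:  (T,) attention coefficients, one per time step.
    Returns the (N,) weighted sum of spikes over time.
    """
    return alpha @ spikes

T, N = 4, 3
spikes = np.array([[1, 0, 1],
                   [0, 1, 1],
                   [1, 1, 0],
                   [0, 0, 1]], dtype=float)

# Rate coding as a special case: uniform coefficients give the mean firing rate.
rate = sac_decode(spikes, np.full(T, 1.0 / T))

# A temporal-coding-like pattern: exponentially decaying coefficients
# that emphasize early spikes (an illustrative choice, not the paper's).
decay = 0.5 ** np.arange(T)
temporal = sac_decode(spikes, decay / decay.sum())
```

With learnable `alpha`, gradient descent can interpolate between these extremes or find task-specific patterns; the paper's normalization and regularization terms would then constrain the range and distribution of the learned coefficients.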
Keywords
Rate coding, spike attention coding (SAC), spiking neural networks (SNNs), temporal coding