Spike Attention Coding for Spiking Neural Networks

IEEE Transactions on Neural Networks and Learning Systems (2023)

Spiking neural networks (SNNs), an important family of neuroscience-oriented intelligent models, play an essential role in the neuromorphic computing community. Spike rate coding and temporal coding are the mainstream coding schemes in current SNN modeling. However, rate coding usually suffers from limited representation resolution and long latency, while temporal coding usually under-utilizes spike activities. To this end, we propose spike attention coding (SAC) for SNNs. By introducing a learnable attention coefficient for each time step, our coding scheme naturally unifies rate coding and temporal coding, and can flexibly learn optimal coefficients for better performance. Several normalization and regularization techniques are further incorporated to control the range and distribution of the learned attention coefficients. Extensive experiments on classification, generation, and regression tasks demonstrate the superiority of the proposed coding scheme. This work provides a flexible coding scheme that enhances the representation power of SNNs and extends their application scope beyond the mainstream classification scenario.
Rate coding, spike attention coding (SAC), spiking neural networks (SNNs), temporal coding
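To make the core idea of the abstract concrete, here is a minimal sketch of a per-time-step weighted readout of a spike train. All names (`spike_attention_readout`, the softmax normalization choice) are illustrative assumptions, not the paper's actual formulation: the key point is that uniform coefficients recover rate coding (a mean over time steps), while a sharply peaked coefficient recovers a temporal-coding-like readout of a single step.

```python
import numpy as np

def spike_attention_readout(spikes: np.ndarray, alpha: np.ndarray) -> np.ndarray:
    """Decode a spike train by a learnable weighted sum over time steps.

    spikes : (T, N) array of binary spike activities over T time steps.
    alpha  : (T,) learnable attention coefficients, one per time step.

    Softmax normalization (an assumed choice standing in for the paper's
    normalization techniques) keeps the weights positive and summing to 1.
    """
    w = np.exp(alpha - alpha.max())  # subtract max for numerical stability
    w /= w.sum()
    return w @ spikes  # (N,) decoded output

# Toy spike train: T = 4 time steps, N = 3 neurons.
T, N = 4, 3
spikes = np.array([[1, 0, 1],
                   [0, 1, 1],
                   [1, 1, 0],
                   [0, 0, 1]], dtype=float)

# Uniform coefficients -> softmax weights of 1/T each, i.e. the mean firing
# rate per neuron: classic rate coding as a special case.
rate = spike_attention_readout(spikes, np.zeros(T))

# A large coefficient on one step concentrates nearly all weight there,
# approximating a temporal readout of that single time step.
first_step = spike_attention_readout(spikes, np.array([100.0, 0.0, 0.0, 0.0]))
```

In a trained SNN the coefficients `alpha` would be learned jointly with the network weights, letting the model interpolate between these two extremes rather than committing to either coding scheme in advance.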