Direct Training High-Performance Deep Spiking Neural Networks: A Review of Theories and Methods
arXiv (2024)
Abstract
Spiking neural networks (SNNs) offer a promising energy-efficient alternative
to artificial neural networks (ANNs) by virtue of their high biological
plausibility, rich spatial-temporal dynamics, and event-driven computation.
Direct training algorithms based on the surrogate gradient method provide
sufficient flexibility to design novel SNN architectures and explore the
spatial-temporal dynamics of SNNs. According to previous studies, model
performance is highly dependent on model size. Recently, directly trained deep
SNNs have achieved great progress on both neuromorphic datasets and large-scale
static datasets. Notably, transformer-based SNNs show performance comparable to
their ANN counterparts. In this paper, we provide a new perspective to
summarize the theories and methods for training high-performance deep SNNs in a
systematic and comprehensive way, covering theoretical fundamentals, spiking
neuron models, advanced SNN models and residual architectures, software
frameworks and neuromorphic hardware, applications, and future trends. The
reviewed papers are collected at
https://github.com/zhouchenlin2096/Awesome-Spiking-Neural-Networks
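The abstract's core idea — direct training of SNNs with a surrogate gradient — can be illustrated with a minimal sketch. It is not code from the paper: the leaky integrate-and-fire (LIF) update, the hard reset, and the fast-sigmoid surrogate shape (with illustrative parameters `tau`, `v_th`, `alpha`) are common choices in the SNN literature, assumed here for demonstration. The spike function is a non-differentiable Heaviside step in the forward pass; training replaces its derivative with a smooth surrogate in the backward pass.

```python
def lif_forward(inputs, tau=2.0, v_th=1.0):
    """Simulate one LIF neuron over discrete time steps.

    Returns the binary spike train and the membrane-potential trace.
    Parameters tau (membrane time constant) and v_th (firing
    threshold) are illustrative values, not taken from the paper.
    """
    v, spikes, trace = 0.0, [], []
    for x in inputs:
        v = v + (x - v) / tau           # leaky integration of input current
        s = 1.0 if v >= v_th else 0.0   # Heaviside spike: non-differentiable
        trace.append(v)
        spikes.append(s)
        v = v * (1.0 - s)               # hard reset after a spike
    return spikes, trace

def surrogate_grad(v, v_th=1.0, alpha=4.0):
    """Fast-sigmoid surrogate for dS/dV, used only in the backward pass.

    The true derivative of the Heaviside step is zero almost everywhere,
    so gradient descent replaces it with this smooth approximation.
    """
    return alpha / (2.0 * (1.0 + alpha * abs(v - v_th)) ** 2)
```

For example, with inputs `[1.5, 0.2, 1.8]` the membrane potential crosses the threshold only at the third step, so the neuron emits the spike train `[0, 0, 1]`; the surrogate gradient is largest near the threshold, which is what lets error signals flow through the otherwise flat spike function.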