SNNSim: Investigation and Optimization of Large-Scale Analog Spiking Neural Networks Based on Flash Memory Devices

Jong Hyun Ko, Dongseok Kwon, Joon Hwang, Kyu-Ho Lee, Seongbin Oh, Jeonghyun Kim, Jiseong Im, Ryun-Han Koo, Jae-Joon Kim, Jong-Ho Lee

Advanced Intelligent Systems (2024)

Abstract
Spiking neural networks (SNNs) have emerged as a novel approach for reducing computational costs by mimicking the biologically plausible operations of neurons and synapses. In this article, large-scale analog SNNs are investigated and optimized at the hardware level using SNNSim, a novel simulator for SNNs that employ analog synaptic devices and integrate-and-fire (I&F) neuron circuits. SNNSim is a reconfigurable simulator that accurately and rapidly models the behavior of user-defined device characteristics and returns key metrics such as area, accuracy, latency, and power consumption. Notably, SNNSim is exceptionally efficient: it can process the entire 10,000-image Modified National Institute of Standards and Technology (MNIST) test dataset in a few seconds, whereas SPICE simulations require hours to simulate a single MNIST test image. Using SNNSim, the conversion of artificial neural networks (ANNs) to SNNs is simulated and the performance of large-scale analog SNNs is optimized. The results enable the design of accurate, high-speed, and low-power large-scale SNNs. The SNNSim code is available at https://github.com/SMDLGITHUB/SNNSim.

SNNSim simulates large-scale analog spiking neural networks (SNNs) based on flash memory devices. It optimizes SNNs at the hardware level and evaluates performance metrics such as area, accuracy, latency, and power consumption of analog SNNs. Its rapid processing of large networks significantly advances the design of energy-efficient analog SNNs. (c) 2024 WILEY-VCH GmbH
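The abstract refers to integrate-and-fire (I&F) neuron circuits and rate-based ANN-to-SNN conversion. The following minimal sketch (plain NumPy, not the SNNSim API; all names and parameter values are illustrative assumptions) shows the basic behavior such a simulator models: rate-coded input spikes are weighted and accumulated on a membrane potential, which emits an output spike and resets when it crosses a threshold.

```python
import numpy as np

# Generic integrate-and-fire (I&F) neuron driven by rate-coded input spikes.
# Illustration only: this does not use the SNNSim API, and the parameter
# names/values (v_th, t_steps, weights) are hypothetical.

rng = np.random.default_rng(0)

t_steps = 100     # number of discrete time steps
n_inputs = 16     # number of presynaptic inputs
v_th = 1.0        # firing threshold of the membrane potential

# Rate coding: each input fires a Poisson-like spike train whose rate is
# proportional to a normalized analog activation (as in ANN-to-SNN conversion).
rates = rng.uniform(0.0, 0.5, size=n_inputs)         # spike probability per step
spikes_in = rng.random((t_steps, n_inputs)) < rates  # boolean spike trains

weights = rng.normal(0.0, 0.2, size=n_inputs)        # synaptic weights

v_mem = 0.0
spikes_out = 0
for t in range(t_steps):
    v_mem += np.dot(spikes_in[t].astype(float), weights)  # integrate weighted input
    if v_mem >= v_th:                                      # fire on threshold crossing
        spikes_out += 1
        v_mem = 0.0                                        # reset membrane potential

print(f"Output firing rate: {spikes_out / t_steps:.2f} spikes per step")
```

In a hardware-level simulator, the weighted sum above would instead be computed from measured analog synaptic-device conductances, which is where device non-idealities enter the accuracy, latency, and power estimates.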
Keywords
hardware-level, integrate-and-fire neuron, spiking neural networks