Analysis And Mitigation Of Parasitic Resistance Effects For Analog In-Memory Neural Network Acceleration

SEMICONDUCTOR SCIENCE AND TECHNOLOGY (2021)

Abstract
To support the increasing demands for efficient deep neural network processing, accelerators based on analog in-memory computation of matrix multiplication have recently gained significant attention for reducing the energy of neural network inference. However, analog processing within memory arrays must contend with the issue of parasitic voltage drops across the metal interconnects, which distort the results of the computation and limit the array size. This work analyzes how parasitic resistance affects the end-to-end inference accuracy of state-of-the-art convolutional neural networks, and comprehensively studies how various design decisions at the device, circuit, architecture, and algorithm levels affect the system's sensitivity to parasitic resistance effects. A set of guidelines is provided for how to design analog accelerator hardware that is intrinsically robust to parasitic resistance, without any explicit compensation or re-training of the network parameters.
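The abstract does not spell out a circuit model, but the distortion it describes can be illustrated with a minimal nodal-analysis sketch of a single crossbar column. The assumptions here (one bitline with uniform wire resistance `r_wire` between adjacent cells, wordlines driven by ideal voltage sources, output held at virtual ground) are illustrative simplifications, not the paper's own model:

```python
import numpy as np

def ideal_mvm_column(G, V):
    # Ideal analog dot product: output current I = sum_i G_i * V_i
    return float(G @ V)

def parasitic_mvm_column(G, V, r_wire):
    """Output current of one crossbar column whose bitline has a
    parasitic resistance r_wire (ohms) between adjacent cells; the
    last node connects to virtual ground through one more segment.

    Solves KCL at each bitline node i (voltage u[i]):
      G[i]*(V[i]-u[i]) + (u[i-1]-u[i])/r + (u[i+1]-u[i])/r = 0,
    with no segment before node 0 and u[N] = 0 (virtual ground).
    """
    N = len(G)
    g = 1.0 / r_wire                 # conductance of one wire segment
    A = np.zeros((N, N))
    b = np.zeros(N)
    for i in range(N):
        A[i, i] = G[i]               # cell conductance into node i
        b[i] = G[i] * V[i]
        if i > 0:                    # segment toward previous node
            A[i, i] += g
            A[i, i - 1] -= g
        A[i, i] += g                 # segment toward next node / ground
        if i < N - 1:
            A[i, i + 1] -= g
    u = np.linalg.solve(A, b)
    # Output current = current through the final segment into ground
    return float(u[-1] * g)

# Example: nonzero wire resistance reduces the column current below
# the ideal dot product, i.e. the computation is distorted.
G = np.array([1e-5, 2e-5, 1.5e-5])   # cell conductances (S), illustrative
V = np.array([0.2, 0.5, 0.3])        # input voltages (V), illustrative
I_ideal = ideal_mvm_column(G, V)
I_real = parasitic_mvm_column(G, V, r_wire=5.0)
```

As `r_wire` tends to zero the nodal solution recovers the ideal dot product, and the gap between `I_real` and `I_ideal` grows with both the wire resistance and the number of rows, which is consistent with the abstract's point that parasitic drops limit the usable array size.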
Keywords
neuromorphic computing, machine learning, parasitic resistance, in-memory computing, convolutional neural networks, neural network inference, sensitivity analysis