A 1T2R1C ReRAM CIM Accelerator With Energy-Efficient Voltage Division and Capacitive Coupling for CNN Acceleration in AI Edge Applications

IEEE Transactions on Circuits and Systems II: Express Briefs (2023)

Abstract
Resistive random-access memory (ReRAM) is widely studied for computing-in-memory (CIM) neural network acceleration in edge devices. However, the static current of the conventional 2T2R array becomes prominent as computing parallelism increases, leading to significant power and area costs. To resolve this issue, a voltage-style one-transistor-two-resistor-one-capacitor (1T2R1C) ReRAM CIM accelerator using energy-efficient voltage division and capacitive coupling is proposed for convolutional neural network (CNN) acceleration in AI edge applications. The 1T2R1C cell, which is 15% smaller than the previous 2T2R cell, comprises one selection transistor, two ReRAM resistors for 1-bit weight storage, and a MOS-based capacitor. The multiply-and-accumulate (MAC) operation is realized by voltage division within a cell and capacitive coupling across cells on a plate line. The static current during computation is effectively reduced by the series connection of the low-resistance-state (LRS) and high-resistance-state (HRS) ReRAM resistors within a cell and by the elimination of low-resistance paths in the array. A corresponding weight mapping method is also proposed to transform multi-bit weights from a 0/1 representation into a −1/+1 representation, adapting the proposed accelerator to high-precision CNN applications. Evaluated in a 28nm technology, the proposed accelerator with a 384Kb array achieves a peak energy efficiency of 400.2 TOPS/W, ~8.4X higher than previous work. The CNN inference accuracy reaches 96.2% on the MNIST dataset.
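As a rough illustration of the 0/1 to −1/+1 weight mapping mentioned in the abstract, the Python sketch below (not from the paper; the bit-serial scheme, function names, and the digital offset correction are assumptions) models how a MAC computed with ±1 per-bit weights plus a constant correction term can reproduce a conventional 0/1 bit-weighted MAC, since each stored bit satisfies w = (s + 1)/2 with s ∈ {−1, +1}. The analog voltage-division and capacitive-coupling stage is idealized as an exact dot product.

```python
import numpy as np

def mac_binary(x, w_bits):
    """Reference MAC with 0/1 per-bit weights: sum_j x_j * W_j, W_j = sum_i w_bits[j, i] * 2^i."""
    weights = (w_bits * (2 ** np.arange(w_bits.shape[1]))).sum(axis=1)
    return np.dot(x, weights)

def mac_signed(x, w_bits):
    """Same MAC, but each stored bit is mapped to -1/+1 (s = 2*w - 1).

    The 0/1 result is recovered from the +/-1 partial sums via
        sum_j x_j * w_ji = ( sum_j x_j * s_ji + sum_j x_j ) / 2,
    i.e. one constant offset term (the activation sum) per bit plane.
    """
    s_bits = 2 * w_bits - 1                  # 0/1 -> -1/+1 mapping
    x_sum = x.sum()                          # constant offset term
    acc = 0
    for i in range(w_bits.shape[1]):         # bit-serial accumulation over weight bits
        partial = np.dot(x, s_bits[:, i])    # idealized analog +/-1 MAC for one bit plane
        acc += ((partial + x_sum) / 2) * (2 ** i)
    return acc

# Hypothetical example: 64 inputs, 4-bit activations, 4-bit weights (one 0/1 bit per column).
rng = np.random.default_rng(0)
x = rng.integers(0, 16, size=64)
w_bits = rng.integers(0, 2, size=(64, 4))
assert mac_binary(x, w_bits) == mac_signed(x, w_bits)
```

The correction term depends only on the sum of the activations, so in a hardware mapping of this kind it could be computed once per input vector and reused across all weight bit planes; the exact compensation circuit used by the paper is not described in the abstract.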
Keywords
Computing-in-memory, ReRAM, 1T2R1C, capacitive coupling, weight mapping