Countering Uncertainties in In-Memory-Computing Platforms with Statistical Training, Accuracy Compensation and Recursive Test

2023 Design, Automation & Test in Europe Conference & Exhibition (DATE 2023)

Abstract
In-memory computing (IMC) has become an efficient solution for implementing neural networks in hardware. However, IMC platforms require the weights of neural networks to be programmed to exact values. This is a very demanding task due to programming complexity, process variations, noise, and thermal effects. Accordingly, new methods are needed to counter such uncertainties. In this paper, we first discuss a method to train neural networks statistically, with process variations modeled as correlated random variables. The statistical effect is incorporated into the cost function during training, so that the resulting neural network becomes robust to uncertainties. To deal with variations and noise further, we also introduce a compensation method that appends extra layers to the neural network. These extra layers are trained offline, after the weights of the original network have been determined, to enhance the inference accuracy. Finally, we discuss a method for testing the effect of process variations in an optical acceleration platform for neural networks. This platform uses Mach-Zehnder Interferometers (MZIs) to implement the multiply-accumulate operations. However, the trigonometric functions in the transformation matrix of an MZI make it highly sensitive to process variations. To address this problem, we apply a recursive test procedure to determine the properties of the MZIs inside an optical acceleration module, so that process variations can be compensated accordingly and the inference accuracy of the neural network maintained.
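To illustrate the first idea, the following is a minimal PyTorch sketch, not the authors' code: process variations are modeled as correlated Gaussian perturbations of the weights, and the cost function is a Monte-Carlo estimate of the expected loss under those perturbations. The class and function names (VariationAwareLinear, statistical_loss), the exponential-decay correlation structure, and all hyperparameters (sigma, corr, n_samples) are illustrative assumptions.

```python
import torch
import torch.nn as nn

class VariationAwareLinear(nn.Module):
    """Linear layer whose weights are perturbed by correlated noise,
    modeling process variations (assumed covariance structure)."""

    def __init__(self, in_features, out_features, sigma=0.05, corr=0.5):
        super().__init__()
        self.linear = nn.Linear(in_features, out_features)
        n = in_features * out_features  # dense covariance: fine for small layers only
        # Assumed covariance: variance sigma^2, correlation decaying with index distance.
        idx = torch.arange(n, dtype=torch.float32)
        cov = sigma ** 2 * corr ** (idx[None, :] - idx[:, None]).abs()
        self.register_buffer("chol", torch.linalg.cholesky(cov))

    def forward(self, x):
        # Sample one correlated perturbation of the weight matrix per forward pass.
        z = self.chol @ torch.randn(self.chol.shape[0], device=x.device)
        w = self.linear.weight + z.view_as(self.linear.weight)
        return nn.functional.linear(x, w, self.linear.bias)

def statistical_loss(model, x, y, criterion, n_samples=8):
    # Monte-Carlo estimate of the expected loss under process variations;
    # minimizing this expectation is what makes the trained network robust.
    return sum(criterion(model(x), y) for _ in range(n_samples)) / n_samples
```

Training then proceeds as usual, except that statistical_loss replaces the plain loss, so every gradient step averages over sampled variation instances.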
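The compensation method can be sketched in the same spirit. Below is an assumed minimal version, not the paper's implementation: the original network's weights are frozen as deployed, and small extra layers appended at the output are the only parameters trained offline. The helper name add_compensation and the layer width are hypothetical.

```python
import torch.nn as nn

def add_compensation(frozen_net, num_classes, hidden=32):
    """Append trainable compensation layers to a frozen, deployed network."""
    for p in frozen_net.parameters():
        p.requires_grad_(False)          # original weights stay fixed
    comp = nn.Sequential(                # extra layers; width is an assumption
        nn.Linear(num_classes, hidden),
        nn.ReLU(),
        nn.Linear(hidden, num_classes),
    )
    # Only `comp` receives gradients during the offline retraining pass.
    return nn.Sequential(frozen_net, comp)
```

Because only the small appended layers are optimized, the offline retraining is cheap and leaves the programmed in-memory weights untouched.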
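Finally, a NumPy sketch of why an MZI is sensitive to process variations and how a test step can compensate for them. This uses one common 2x2 parameterization of the MZI transfer matrix with internal phase theta and ideal 50:50 couplers; the paper's exact convention may differ, and the paper's procedure tests a full MZI mesh recursively, whereas this sketch shows only the single-device characterization step it builds on. All function names here are hypothetical.

```python
import numpy as np

def mzi(theta, phi=0.0):
    # One common MZI transfer-matrix convention; note the trigonometric
    # dependence on theta that makes the device variation-sensitive.
    return 1j * np.exp(1j * theta / 2) * np.array([
        [np.exp(1j * phi) * np.sin(theta / 2), np.cos(theta / 2)],
        [np.exp(1j * phi) * np.cos(theta / 2), -np.sin(theta / 2)],
    ])

def measure_split(theta_actual):
    # Launch light into input port 0 and read the power at output port 0;
    # for this convention the result is sin^2(theta/2).
    out = mzi(theta_actual) @ np.array([1.0, 0.0])
    return abs(out[0]) ** 2

def characterize(theta_programmed, d_theta):
    # Infer the realized phase from the measured split ratio, then return
    # the corrected phase to program so the target response is restored.
    p = measure_split(theta_programmed + d_theta)       # d_theta: fabrication error
    theta_est = 2 * np.arcsin(np.sqrt(p))               # valid on [0, pi] only
    return theta_est, theta_programmed - (theta_est - theta_programmed)

theta_est, theta_corrected = characterize(np.pi / 3, d_theta=0.05)
```

Repeating such a measure-and-correct step device by device, from the outputs of a mesh inward, is the intuition behind a recursive test of an optical acceleration module.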