Classification Accuracy Improvement for Neuromorphic Computing Systems with One-level Precision Synapses

22nd Asia and South Pacific Design Automation Conference (ASP-DAC), 2017

Abstract
Brain-inspired neuromorphic computing has demonstrated remarkable advantages over the traditional von Neumann architecture owing to its high energy efficiency and parallel data processing. However, the limited resolution of synaptic weights degrades system accuracy and thus impedes the adoption of neuromorphic systems. In this work, we propose three orthogonal methods for learning synapses with one-level precision, namely distribution-aware quantization, quantization regularization, and bias tuning, which make image classification accuracy comparable to the state of the art. Experiments on both multi-layer perceptron and convolutional neural networks show that the accuracy drop can be controlled to within 0.19% (5.53%) on the MNIST (CIFAR-10) dataset, compared to an ideal system without quantization.
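
To make the abstract's terminology concrete, below is a minimal NumPy sketch of binary (one-level) weight quantization together with a quantization-regularization penalty. It is an illustration under assumptions, not the authors' implementation: reading one-level precision as a ±scale binary synapse, choosing the level from the mean absolute weight, the L2 penalty form, and the helper names quantize_one_level and quantization_penalty are all assumptions for illustration.

```python
import numpy as np

def quantize_one_level(w, scale):
    # Map each full-precision weight to +scale or -scale, a sketch of
    # "one-level precision" read as a binary synapse; the paper's exact
    # mapping (and any zero state) may differ.
    return scale * np.sign(w)

def quantization_penalty(w, scale):
    # Regularization term pulling full-precision weights toward the
    # nearest quantized level during training (assumed L2 form).
    return np.sum((w - quantize_one_level(w, scale)) ** 2)

# Example: quantize a random weight matrix and inspect the result.
rng = np.random.default_rng(0)
w = rng.normal(0.0, 0.1, size=(4, 4))

# Distribution-aware choice of the level: derive it from the weight
# statistics rather than a fixed constant (assumed heuristic).
scale = np.mean(np.abs(w))

w_q = quantize_one_level(w, scale)
print("quantized levels:", np.unique(np.round(w_q, 4)))
print("regularization penalty:", quantization_penalty(w, scale))
```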
Keywords
brain-inspired neuromorphic computing system, one-level precision synapses, energy efficiency, parallel data processing, distribution-aware quantization, quantization regularization, bias tuning, image classification accuracy, multilayer perceptron, convolutional neural networks, MNIST database