A Microarchitecture Implementation Framework for Online Learning with Temporal Neural Networks
2021 IEEE Computer Society Annual Symposium on VLSI (ISVLSI), 2021
Abstract
Temporal Neural Networks (TNNs) are spiking neural networks that use time as a resource to represent and process information, similar to the mammalian neocortex. In contrast to compute-intensive deep neural networks that employ separate training and inference phases, TNNs are capable of extremely efficient online incremental/continual learning and are excellent candidates for building edge-native ...
Keywords
Training, Deep learning, Microarchitecture, Neurons, Buildings, Logic gates, Very large scale integration