Hardware Implementation for Multiple Activation Functions.

ICCE-TW (2019)

Abstract
In a neural network, the activation function of a node defines that node's output. In this paper, we propose a novel hardware implementation for AI hardware accelerators that supports three widely used activation functions: Hyperbolic Tangent, Sigmoid, and ReLU. With this implementation, users can configure the activation function of each node at execution time, which provides greater flexibility for AI hardware computation. Implementation results show that the proposed hardware works well in practice.
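The following is a minimal software sketch of the runtime-selectable behavior the abstract describes: one activation unit that can be switched among Tanh, Sigmoid, and ReLU by a mode selector chosen at execution time. The selector names and the use of exact floating-point functions are assumptions for illustration only; the paper's actual circuit-level design (fixed-point formats, any piecewise approximation, etc.) is not given in the abstract.

```python
import math

# Hypothetical mode table: maps a runtime selector to an activation function.
# In real accelerator hardware these would typically be approximated circuits,
# not exact math-library calls.
ACTIVATIONS = {
    "tanh": math.tanh,                                # Hyperbolic Tangent
    "sigmoid": lambda x: 1.0 / (1.0 + math.exp(-x)),  # Sigmoid
    "relu": lambda x: max(0.0, x),                    # ReLU
}

def activation_unit(x: float, mode: str) -> float:
    """Apply the activation selected at execution time by `mode`."""
    return ACTIVATIONS[mode](x)

# Usage: the same node can be reconfigured per layer or per run.
print(activation_unit(0.5, "tanh"))     # ~0.4621
print(activation_unit(0.5, "sigmoid"))  # ~0.6225
print(activation_unit(-0.5, "relu"))    # 0.0
```

The sketch only illustrates the configurability claim, i.e. that one shared unit serves all three functions via a runtime selector rather than dedicating separate fixed-function blocks to each node.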
Keywords
multiple activation functions, neural network, hardware implementation, AI hardware accelerators, AI hardware computation, Hyperbolic Tangent, Sigmoid, ReLU