Deeper Neural Networks with Non-Vanishing Logistic Hidden Units - NoVa vs. ReLU Neurons.

ICMLA (2021)

Abstract
The new NoVa (non-vanishing) logistic neuron activation allows deeper neural networks because its derivative is everywhere positive, which helps mitigate the vanishing-gradient problem in deep networks. Deep neural classifiers with NoVa hidden units achieved better classification accuracy on the CIFAR-10, CIFAR-100, and Caltech-256 image databases than classifiers with threshold-linear ReLU hidden units. Even simpler identity hidden units also outperformed ReLU hidden units in deep classifiers, though they usually had lower classification accuracy than NoVa networks. NoVa hidden neurons likewise outperformed ReLU hidden neurons in deep convolutional neural networks.
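To make the everywhere-positive derivative concrete, here is a minimal NumPy sketch of a NoVa-style unit. The functional form f(x) = a·x + b·σ(c·x) and the parameter names a, b, c are assumptions for illustration only; the paper's exact parameterization may differ. The point of the sketch is the key property the abstract names: f'(x) = a + b·c·σ(c·x)(1 − σ(c·x)) is bounded below by a > 0, so the gradient never vanishes, whereas ReLU's derivative is exactly zero on negative inputs.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def nova(x, a=1.0, b=1.0, c=1.0):
    # Hypothetical NoVa-style unit: scaled identity plus scaled logistic.
    # Parameterization assumed for illustration, not the paper's exact form.
    return a * x + b * sigmoid(c * x)

def nova_grad(x, a=1.0, b=1.0, c=1.0):
    # Derivative a + b*c*sigma'(c*x); bounded below by a > 0 everywhere,
    # so the unit's gradient cannot vanish.
    s = sigmoid(c * x)
    return a + b * c * s * (1.0 - s)

def relu_grad(x):
    # ReLU's derivative is 0 for all negative inputs ("dying ReLU").
    return (x > 0).astype(float)

x = np.linspace(-6.0, 6.0, 5)
print(nova_grad(x))   # every entry >= 1.0 here (a = 1)
print(relu_grad(x))   # zero on the negative half-line
```

In a deep network, backpropagated gradients multiply one such derivative factor per layer, so a strictly positive lower bound on the activation's slope prevents the product from collapsing to zero the way it can with saturating or thresholded units.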
Keywords
vanishing gradient, non-vanishing logistic, ReLU activation, deep neural network