Performance Analysis Of Nonlinear Activation Function In Convolution Neural Network For Image Classification

International Journal of Computational Science and Engineering (2020)

Abstract
Very deep learning architectures have proved to be extremely powerful models for image processing. However, as architectures grow deeper, the training process faces challenges such as overfitting, computational cost, exploding/vanishing gradients, and degradation. A state-of-the-art densely connected architecture, called DenseNets, has achieved exceptionally strong results for image classification, but it is still computationally costly to train. The choice of activation function is also an important aspect of training deep networks, since it has a considerable impact on both the training process and the performance of the model. This paper therefore presents an empirical analysis of several nonlinear activation functions used in deep learning for image classification. The activation functions evaluated are ReLU, Leaky ReLU, ELU, SELU, and an ensemble of SELU and ELU. The publicly available datasets CIFAR-10, SVHN, and PlantVillage are used for evaluation.
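For reference, the nonlinear activation functions compared in the abstract can be sketched as below. This is a minimal NumPy illustration of the standard definitions; the parameter values (Leaky ReLU slope 0.01, ELU alpha = 1.0, and the SELU self-normalizing constants) are common defaults, not values taken from this paper.

```python
import numpy as np

def relu(x):
    # ReLU: max(0, x)
    return np.maximum(0.0, x)

def leaky_relu(x, negative_slope=0.01):
    # Leaky ReLU: small non-zero slope for negative inputs
    return np.where(x > 0, x, negative_slope * x)

def elu(x, alpha=1.0):
    # ELU: smooth exponential saturation toward -alpha for negative inputs
    return np.where(x > 0, x, alpha * (np.exp(x) - 1.0))

def selu(x, alpha=1.6732632423543772, scale=1.0507009873554805):
    # SELU: scaled ELU; constants chosen so activations self-normalize
    return scale * np.where(x > 0, x, alpha * (np.exp(x) - 1.0))
```

An "ensemble" of SELU and ELU, as evaluated in the paper, could for example combine their outputs; the exact ensembling scheme is described in the full text, not here.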
Keywords
deep learning, convolution neural networks, activation functions, nonlinear activation functions, image classification, rectified linear unit, exponential linear unit, scaled exponential linear unit, leaky rectified linear unit, DenseNet