Overfitting remedy by sparsifying regularization on fully-connected layers of CNNs.

Neurocomputing (2019)

Cited 131 | Viewed 68
Abstract
Deep learning, especially Convolutional Neural Networks (CNNs), has been widely applied in many domains. The large number of parameters in a CNN allows it to learn complex features; however, these parameters may also hinder generalization by over-fitting the training data. Despite many previously proposed regularization methods, over-fitting remains a problem in training a robust CNN. Among the many factors that lead to over-fitting, the numerous parameters of the fully-connected layers (FCLs) of a typical CNN should be taken into account. This paper proposes SparseConnect, a simple idea that alleviates over-fitting by sparsifying connections to FCLs. Experimental results on three benchmark datasets, MNIST, CIFAR10 and ImageNet, show that SparseConnect outperforms several state-of-the-art regularization methods.
Keywords
Convolutional neural networks,Fully-connected layers,Overfitting
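The abstract does not spell out the sparsification mechanism SparseConnect uses. As a rough illustration of the general idea of sparsifying FCL connections, here is a minimal sketch using one common approach, soft-thresholding (the proximal operator of an L1 penalty) applied to a fully-connected layer's weight matrix; the function name and threshold value are illustrative assumptions, not the paper's method:

```python
import numpy as np

def soft_threshold(W, lam):
    # Proximal operator of the L1 norm: shrinks every weight toward zero
    # and exactly zeroes those with magnitude <= lam, pruning connections.
    # (Illustrative stand-in for a sparsifying regularizer on FCL weights.)
    return np.sign(W) * np.maximum(np.abs(W) - lam, 0.0)

rng = np.random.default_rng(0)
# Hypothetical FCL: 256 inputs, 10 outputs (e.g., a classifier head).
W = rng.normal(scale=0.1, size=(256, 10))
W_sparse = soft_threshold(W, lam=0.05)

# Fraction of connections removed by the sparsifying step.
sparsity = float(np.mean(W_sparse == 0.0))
print(f"fraction of pruned connections: {sparsity:.2f}")
```

In practice such a step would be interleaved with gradient updates during training, so that the surviving connections are re-fit after each round of pruning.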