Learning Neural Networks without Lazy Weights
2022 IEEE International Conference on Big Data and Smart Computing (BigComp), 2022
Abstract
Various approaches have been suggested for the regularization of neural networks, including the well-known Dropout and DropConnect, which are simple and efficient to implement and have therefore been widely used. However, there is a risk of losing well-trained weights when nodes or weights are dropped randomly. In this paper, we propose a regularization method that preserves well-trained weights and...
Keywords
Neural networks, Regularization, Overfitting, Deep learning
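The abstract contrasts the proposed method with standard Dropout, which zeroes activations at random during training and can therefore discard contributions from already well-trained weights. As background only (the paper's own method is not shown in the truncated abstract), here is a minimal sketch of inverted Dropout; all names are illustrative:

```python
import numpy as np

def dropout(x, p, rng):
    """Inverted Dropout: zero each activation with probability p,
    scaling survivors by 1/(1-p) so the expected activation is unchanged.
    This is the standard technique the paper compares against, not the
    paper's proposed regularizer."""
    if p <= 0.0:
        return x
    # Bernoulli keep-mask: 1 with probability (1 - p), 0 otherwise
    mask = (rng.random(x.shape) >= p).astype(x.dtype)
    return x * mask / (1.0 - p)

# Example: a batch of activations; roughly half are zeroed at p=0.5,
# and the kept units are scaled by 2 to preserve the expected value.
rng = np.random.default_rng(0)
acts = np.ones((4, 8))
out = dropout(acts, p=0.5, rng=rng)
```

At test time Dropout is simply disabled (`p=0`), which the inverted scaling above makes possible without rescaling the weights.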