Norm-Preservation: Why Residual Networks Can Become Extremely Deep?

IEEE Transactions on Pattern Analysis and Machine Intelligence (2021)

Cited by 93
Abstract
Augmenting neural networks with skip connections, as introduced in the so-called ResNet architecture, surprised the community by enabling the training of networks of more than 1,000 layers with significant performance gains. This paper deciphers ResNet by analyzing the effect of skip connections, and puts forward new theoretical results on the advantages of identity skip connections in neural netw...
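The abstract's central object is the identity skip connection, which computes y = x + F(x) so each block is a perturbation of the identity map. A minimal NumPy sketch (illustrative only, not the paper's implementation; the two-layer branch F and the weight scale are assumptions for the example) shows how such a block approximately preserves the norm of its input when the residual branch is small:

```python
import numpy as np

def relu(x):
    return np.maximum(0.0, x)

def residual_block(x, W1, W2):
    """Identity skip connection: output = x + F(x),
    where F is a small two-layer residual branch."""
    return x + W2 @ relu(W1 @ x)

rng = np.random.default_rng(0)
d = 64
x = rng.standard_normal(d)

# Small-magnitude weights: F(x) is a minor perturbation,
# so the block stays close to the identity map.
W1 = 0.01 * rng.standard_normal((d, d))
W2 = 0.01 * rng.standard_normal((d, d))

y = residual_block(x, W1, W2)
# Ratio close to 1.0: the block approximately preserves the input norm.
print(np.linalg.norm(y) / np.linalg.norm(x))
```

Stacking many such near-identity blocks keeps activation and gradient norms stable across depth, which is the intuition the paper's norm-preservation analysis makes precise.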
Keywords
Optimization, Training, Residual neural networks, Convolution, Numerical stability, Computer architecture