Size-Independent Sample Complexity of Neural Networks

Information and Inference: A Journal of the IMA (2018)

Cited by 540 | Views 171
Abstract
We study the sample complexity of learning neural networks by providing new bounds on their Rademacher complexity, assuming norm constraints on the parameter matrix of each layer. Compared to previous work, these complexity bounds have improved dependence on the network depth and, under some additional assumptions, are fully independent of the network size (both depth and width). These results are derived using some novel techniques, which may be of independent interest.
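To make the depth-dependence claim concrete, the paper's Frobenius-norm bound (for networks with 1-Lipschitz, positive-homogeneous activations such as the ReLU) is roughly of the following form; the notation below ($m$ samples, input norm bound $B$, depth $d$, per-layer Frobenius-norm bounds $M_F(j)$) is standard usage rather than a verbatim quote from the paper:

% Sketch of the norm-based Rademacher complexity bound (notation ours):
%   m      = number of samples
%   B      = bound on the input norm ||x||
%   d      = network depth
%   M_F(j) = Frobenius-norm bound on the weight matrix of layer j
% Earlier norm-based bounds scaled exponentially in d (roughly 2^d);
% here the depth factor improves to sqrt(d):
\[
  \mathcal{R}_m(\mathcal{H}) \;\le\;
  \frac{B\left(\sqrt{2\log(2)\,d} + 1\right)\prod_{j=1}^{d} M_F(j)}{\sqrt{m}}
  \;=\; \mathcal{O}\!\left(\frac{B\,\sqrt{d}\,\prod_{j=1}^{d} M_F(j)}{\sqrt{m}}\right).
\]

Under the paper's additional assumptions (e.g., control of the spectral norms relative to the Frobenius norms), the residual $\sqrt{d}$ factor can be removed as well, which is what yields the fully size-independent bounds referred to in the abstract.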
Keywords
neural networks, deep learning, sample complexity, Rademacher complexity