Risk bounds for aggregated shallow neural networks using Gaussian priors.

Laura Tinsi, Arnak S. Dalalyan

Annual Conference on Computational Learning Theory (2022)

Abstract
Analysing statistical properties of neural networks is a central topic in statistics and machine learning. However, most results in the literature focus on the properties of the neural network minimizing the training error. The goal of this paper is to consider aggregated neural networks using a Gaussian prior. The departure point of our approach is an arbitrary aggregate satisfying the PAC-Bayesian inequality. The main contribution is a precise nonasymptotic assessment of the estimation error appearing in the PAC-Bayes bound. Our analysis is sharp enough to lead to minimax rates of estimation over Sobolev smoothness classes.
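To make the abstract's setting concrete, the sketch below shows one simple instance of the kind of aggregate it describes: shallow (one-hidden-layer) networks with weights drawn i.i.d. from a Gaussian prior, combined by exponential (Gibbs) weights based on empirical risk. This is an illustrative toy in NumPy, not the paper's exact estimator; the network width, number of draws `M`, and temperature `beta` are arbitrary choices for the example.

```python
# Hedged sketch: exponentially weighted aggregation of shallow networks whose
# weights are drawn from a Gaussian prior. Illustrates the general PAC-Bayes
# aggregation setting of the abstract; NOT the paper's specific construction.
import numpy as np

rng = np.random.default_rng(0)

# Toy 1-D regression data.
n = 200
x = rng.uniform(-1.0, 1.0, size=(n, 1))
y = np.sin(3.0 * x[:, 0]) + 0.1 * rng.standard_normal(n)

def shallow_net(x, W, b, v):
    """One-hidden-layer ReLU network: x -> v^T relu(W x + b)."""
    h = np.maximum(x @ W.T + b, 0.0)   # (n, width) hidden activations
    return h @ v                        # (n,) scalar outputs

# Draw M candidate networks with i.i.d. Gaussian weights (the prior).
M, width = 500, 16
params = [(rng.standard_normal((width, 1)),      # W
           rng.standard_normal(width),           # b
           rng.standard_normal(width) / width)   # v (scaled output layer)
          for _ in range(M)]

# Empirical risks and Gibbs weights; beta is a free temperature parameter.
beta = 4.0
risks = np.array([np.mean((shallow_net(x, W, b, v) - y) ** 2)
                  for W, b, v in params])
w = np.exp(-beta * n * (risks - risks.min()))    # shift for numerical stability
w /= w.sum()

# Aggregate prediction: posterior-weighted average of the candidate networks.
preds = np.stack([shallow_net(x, W, b, v) for W, b, v in params])  # (M, n)
agg = w @ preds                                                    # (n,)
print("aggregate MSE:", np.mean((agg - y) ** 2))
print("best single-network MSE:", risks.min())
```

By convexity of the squared loss, the aggregate's MSE is never worse than the weight-averaged MSE of the individual networks, which is the kind of oracle-style comparison that PAC-Bayes bounds make precise.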
Keywords
aggregated shallow neural networks, neural networks, Gaussian