Optimal Rates of Approximation by Shallow ReLU^k Neural Networks and Applications to Nonparametric Regression

Constructive Approximation (2024)

Abstract
We study the approximation capacity of some variation spaces corresponding to shallow ReLU^k neural networks. It is shown that sufficiently smooth functions are contained in these spaces with finite variation norms. For functions with less smoothness, approximation rates in terms of the variation norm are established. Using these results, we prove optimal approximation rates, in terms of the number of neurons, for shallow ReLU^k neural networks. It is also shown how these results can be used to derive approximation bounds for deep neural networks and convolutional neural networks (CNNs). As applications, we study convergence rates for nonparametric regression using three ReLU neural network models: shallow neural networks, over-parameterized neural networks, and CNNs. In particular, we show that shallow neural networks can achieve the minimax optimal rates for learning Hölder functions, which complements recent results for deep neural networks. It is also proven that over-parameterized (deep or shallow) neural networks can achieve nearly optimal rates for nonparametric regression.
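For context, the minimax rate referred to here is the classical one for nonparametric regression: for Hölder functions of smoothness α on a d-dimensional domain, no estimator can beat the order n^{-2α/(2α+d)} in sample size n, up to logarithmic factors. To make the model class concrete, the following is a minimal NumPy sketch of a shallow ReLU^k network f(x) = Σ_i a_i σ_k(w_i · x + b_i) with σ_k(t) = max(t, 0)^k; the parameter names and the random initialization are illustrative, not taken from the paper. The number of neurons n is the complexity measure in which the approximation rates above are stated.

```python
import numpy as np

def relu_k(z, k):
    """ReLU^k activation: sigma_k(t) = max(t, 0) ** k."""
    return np.maximum(z, 0.0) ** k

def shallow_relu_k_net(x, W, b, a, k):
    """Shallow ReLU^k network f(x) = sum_i a_i * sigma_k(w_i . x + b_i).

    x: (d,) input point
    W: (n, d) inner weights, one row per neuron
    b: (n,) biases
    a: (n,) outer (linear) coefficients
    """
    return a @ relu_k(W @ x + b, k)

# Illustrative usage: n = 64 random neurons in d = 3 dimensions, k = 2.
rng = np.random.default_rng(0)
n, d, k = 64, 3, 2
W = rng.standard_normal((n, d))
b = rng.standard_normal(n)
a = rng.standard_normal(n) / n
x = rng.standard_normal(d)
print(shallow_relu_k_net(x, W, b, a, k))
```

The case k = 1 recovers the ordinary ReLU network; roughly speaking, larger k yields smoother ridge activations, which is consistent with the paper's theme that the associated variation spaces capture smoother target functions.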
Keywords
Neural network, Approximation rate, Nonparametric regression, Spherical harmonic, 41A25, 62G08, 68T07