Universal approximation properties of shallow quadratic neural networks

arXiv (2021)

Abstract
In this paper we propose a new class of neural network functions that are linear combinations of compositions of activation functions with quadratic functions, which replace the standard affine linear functions, often called neurons. We show the universality of this approximation and prove convergence rate results based on the theory of wavelets and statistical learning. We investigate the efficiency of our new approach numerically on simple test cases, showing that it requires fewer neurons than standard affine linear neural networks. Similar observations are made when comparing deep (multi-layer) networks.
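The class of functions described in the abstract can be illustrated with a minimal sketch. Here each hidden unit applies an activation to a quadratic pre-activation q_i(x) = xᵀA_i x + b_iᵀx + c_i instead of the usual affine one, and the network output is a linear combination of these units. All names, shapes, and the choice of tanh are illustrative assumptions, not details taken from the paper:

```python
import numpy as np

def quadratic_layer(X, A, b, c, sigma=np.tanh):
    """Apply n quadratic neurons sigma(x^T A_i x + b_i^T x + c_i) to each row of X.

    X: (m, d) inputs; A: (n, d, d); b: (n, d); c: (n,).
    Returns an (m, n) matrix of hidden activations.
    """
    quad = np.einsum('md,ndk,mk->mn', X, A, X)  # x^T A_i x per sample and neuron
    lin = X @ b.T                               # b_i^T x
    return sigma(quad + lin + c)

def shallow_quadratic_net(X, A, b, c, w):
    """Shallow network f(x) = sum_i w_i * sigma(q_i(x)) with quadratic neurons."""
    return quadratic_layer(X, A, b, c) @ w

# Sanity check: with A = I, b = 0, c = -1 and a single output weight,
# the network reduces to tanh(||x||^2 - 1).
rng = np.random.default_rng(0)
X = rng.normal(size=(5, 3))
A = np.eye(3)[None]          # one neuron, (1, 3, 3)
b = np.zeros((1, 3))
c = np.array([-1.0])
w = np.array([1.0])
out = shallow_quadratic_net(X, A, b, c, w)
print(np.allclose(out, np.tanh((X ** 2).sum(axis=1) - 1.0)))  # True
```

Because a quadratic pre-activation can represent spheres and ellipsoids directly, a single such neuron can capture radial structure that an affine neuron cannot, which is one intuition for why fewer neurons may suffice.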