Quasi-Equivalence between Width and Depth of Neural Networks

Journal of Machine Learning Research (2023)

Abstract
While classic studies proved that wide networks allow universal approximation, recent research and the successes of deep learning demonstrate the power of deep networks. Motivated by this symmetry, we investigate whether the design of artificial neural networks should have a directional preference, and what the mechanism of interaction is between the width and depth of a network. Inspired by De Morgan's law, we address this fundamental question by establishing a quasi-equivalence between the width and depth of ReLU networks. We formulate two transforms that map an arbitrary ReLU network to a wide ReLU network and to a deep ReLU network, respectively, such that essentially the same capability as the original network is realized. Based on our findings, a deep network has a wide equivalent, and vice versa, subject to an arbitrarily small error.
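The width–depth interplay stated above can be made concrete with a classic toy case (not the paper's actual transforms, which are given in the full text): a composition of two small ReLU layers produces a sawtooth function that a single, wider ReLU layer can reproduce exactly. The function names `tent`, `deep`, and `wide` below are illustrative choices, not notation from the paper.

```python
import numpy as np

def relu(x):
    return np.maximum(x, 0.0)

# Tent map on [0, 1] from one ReLU layer: peak value 1 at x = 0.5.
def tent(x):
    return 2 * relu(x) - 4 * relu(x - 0.5)

# "Deep" network: composing the tent with itself yields a 4-piece sawtooth.
def deep(x):
    return tent(tent(x))

# "Wide" network: a single hidden layer with 4 ReLU units reproducing the
# same sawtooth on [0, 1] (breakpoints at 0, 0.25, 0.5, 0.75).
def wide(x):
    return (4 * relu(x) - 8 * relu(x - 0.25)
            + 8 * relu(x - 0.5) - 8 * relu(x - 0.75))

xs = np.linspace(0.0, 1.0, 101)
assert np.allclose(deep(xs), wide(xs))  # identical on [0, 1]
```

Note that the wide network here needs more breakpoints (hence more units) to match what depth achieves by composition; the quasi-equivalence in the paper formalizes this trade-off up to an arbitrarily small error.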
Keywords
neural networks, depth, width, quasi-equivalence