Complexity of Neural Network Training and ETR: Extensions with Effectively Continuous Functions

AAAI 2024 (2024)

Abstract
The training problem for neural networks (NNs) is known to be ∃ℝ-complete with respect to ReLU and linear activation functions. We show that the training problem for NNs equipped with arbitrary activation functions is polynomial-time bireducible to the existential theory of the reals extended with the corresponding activation functions. For effectively continuous activation functions (e.g., the sigmoid function), we obtain an inclusion in low levels of the arithmetical hierarchy. Consequently, the sigmoid activation function leads to the existential theory of the reals with the exponential function, and hence the decidability of training NNs with the sigmoid activation function is equivalent to the decidability of the existential theory of the reals with the exponential function, a long-standing open problem. In contrast, we show that the training problem is undecidable if sinusoidal activation functions are considered.
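A schematic view of the bireducibility (our illustration, not a formula from the paper): a training instance asks whether weights and biases exist such that the network fits every sample within a given error, which is precisely an existential sentence over the reals extended with the activation function σ. The symbols D (training set), ε (error bound), and f_{w,b} (the network function built from σ) are illustrative names, not notation from the paper.

% Schematic existential sentence for NN training with activation \sigma (illustrative only)
\[
  \exists w_1 \cdots \exists w_n \,\exists b_1 \cdots \exists b_m :\;
  \bigwedge_{(x,y) \in D} \bigl(f_{w,b}(x) - y\bigr)^2 \le \varepsilon^2,
  \qquad \sigma(t) = \tfrac{1}{1 + e^{-t}} \text{ (sigmoid case)}.
\]

For the sigmoid, σ is definable from the exponential function, which yields the stated equivalence with the existential theory of the reals with exp; for sinusoidal activations, the zeros of the sine function let one define the integers, which is the standard route to undecidability.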
Keywords
ML: Deep Neural Architectures and Foundation Models, ML: Deep Learning Theory