SignReLU neural network and its approximation ability

JOURNAL OF COMPUTATIONAL AND APPLIED MATHEMATICS (2024)

Abstract
Deep neural networks (DNNs) have garnered significant attention in various fields of science and technology in recent years. Activation functions define how neurons in a DNN process incoming signals; they are essential for learning non-linear transformations and for performing diverse computations between successive layers of neurons. In the last few years, researchers have investigated the approximation ability of DNNs to explain their power and success. In this paper, we explore the approximation ability of DNNs with a different activation function, called SignReLU. Our theoretical results demonstrate that SignReLU networks outperform rational and ReLU networks in terms of approximation performance. Numerical experiments comparing SignReLU with existing activations such as ReLU, Leaky ReLU, and ELU illustrate its competitive practical performance.

(c) 2023 Elsevier B.V. All rights reserved.
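Since the abstract only names the activation, a minimal NumPy sketch of SignReLU alongside the baseline activations it is compared against is given below. The negative-branch formula alpha * x / (1 + |x|) is an assumption based on the common definition of SignReLU in the literature, not something stated in this abstract; consult the paper for the exact parameterization.

```python
import numpy as np

# Assumed SignReLU form (not stated in the abstract):
#   sign_relu(x) = x                  for x >= 0
#                = alpha*x / (1+|x|)  for x <  0
# i.e. ReLU on the positive axis and a bounded,
# softsign-like branch on the negative axis.
def sign_relu(x, alpha=1.0):
    return np.where(x >= 0, x, alpha * x / (1.0 + np.abs(x)))

# Baseline activations mentioned in the abstract.
def relu(x):
    return np.maximum(x, 0.0)

def leaky_relu(x, negative_slope=0.01):
    return np.where(x >= 0, x, negative_slope * x)

def elu(x, alpha=1.0):
    return np.where(x >= 0, x, alpha * (np.exp(x) - 1.0))

if __name__ == "__main__":
    x = np.linspace(-5.0, 5.0, 5)
    for name, f in [("SignReLU", sign_relu), ("ReLU", relu),
                    ("LeakyReLU", leaky_relu), ("ELU", elu)]:
        print(f"{name:>10}: {f(x)}")
```

Under this assumed form, the negative branch stays bounded (it tends to -alpha as x goes to negative infinity) while keeping a nonzero gradient, so, like Leaky ReLU and ELU, it avoids the dead-neuron issue of plain ReLU.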
Keywords
Deep neural networks, Activation function, Approximation power, SignReLU activation