ERA: Enhanced Rational Activations

European Conference on Computer Vision (2022)

Abstract
Activation functions play a central role in deep learning since they form an essential building block of neural networks. In the last few years, the focus has shifted towards investigating new types of activations that outperform the classical Rectified Linear Unit (ReLU) in modern neural architectures. Most recently, rational activation functions (RAFs) have attracted interest because they were shown to perform on par with state-of-the-art activations on image classification. Despite their apparent potential, prior formulations are either not safe, not smooth, or not "true" rational functions, and they only work with careful initialisation. Aiming to mitigate these issues, we propose a novel, enhanced rational function, ERA, and investigate how to better accommodate the specific needs of these activations in both network components and the training regime. In addition to being more stable, the proposed function outperforms other standard ones across a range of lightweight network architectures on two different tasks: image classification and 3D human pose and shape reconstruction.
Keywords
Rational activation, Activation function, Deep learning, Neural networks
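To illustrate the concept behind the abstract, a rational activation function computes a ratio of two learnable polynomials, P(x)/Q(x). A common way to make such a function "safe" (pole-free), as the abstract's criticism of unsafe formulations suggests, is to bound the denominator away from zero. The sketch below shows one such generic safe parameterisation; the exact ERA formulation is not given in the abstract, so the choice of denominator here is an illustrative assumption, not the paper's method.

```python
import numpy as np

def safe_rational_activation(x, p_coeffs, q_coeffs):
    """Generic 'safe' rational activation: P(x) / (1 + |Q(x)|).

    p_coeffs, q_coeffs: polynomial coefficients, highest degree first
    (as expected by np.polyval). Adding 1 + |Q(x)| in the denominator
    keeps it strictly positive, so the function has no poles. This is
    one common safe variant, NOT necessarily ERA's parameterisation.
    """
    p = np.polyval(p_coeffs, x)  # numerator polynomial P(x)
    q = np.polyval(q_coeffs, x)  # denominator polynomial Q(x)
    return p / (1.0 + np.abs(q))

# Illustrative coefficients (hypothetical, not from the paper):
# P(x) = 0.5*x^2 + 0.5*x, Q(x) = x, giving a smooth ReLU-like curve.
x = np.linspace(-3.0, 3.0, 7)
y = safe_rational_activation(x, p_coeffs=[0.5, 0.5, 0.0], q_coeffs=[1.0, 0.0])
```

In a network, `p_coeffs` and `q_coeffs` would be trainable parameters shared per layer; the safety of the denominator is what lets training proceed without the careful initialisation the abstract says prior formulations require.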