Dynamic ReLU

Abstract:

Rectified linear units (ReLU) are commonly used in deep neural networks. So far, ReLU and its generalizations (non-parametric or parametric) have been static, performing identically for all input samples. In this paper, we propose Dynamic ReLU (DY-ReLU), a dynamic rectifier whose parameters are input-dependent, generated as a hyper function over a...
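The abstract is truncated, but the key idea it states — an activation whose parameters are produced by a hyper function of the input rather than fixed — can be sketched as follows. This is a minimal, illustrative implementation assuming a piecewise-linear form y = max_k(a_k·x + b_k), where the coefficients (a_k, b_k) come from a small two-layer hyper network conditioned on a pooled summary of the input; the exact architecture in the paper may differ.

```python
import numpy as np

def dy_relu(x, theta_w1, theta_b1, theta_w2, theta_b2, K=2):
    """Sketch of a dynamic rectifier (assumed formulation).

    A tiny hyper network maps a global summary of the input to 2K
    coefficients (K slopes, K intercepts); the activation is the
    elementwise max over the K linear pieces.
    """
    # Global context: average over the feature dimension, shape (batch, 1).
    context = x.mean(axis=-1, keepdims=True)
    # Hyper function: two-layer MLP, context -> 2K coefficients.
    h = np.maximum(0.0, context @ theta_w1 + theta_b1)   # (batch, hidden)
    coeffs = h @ theta_w2 + theta_b2                     # (batch, 2K)
    a = coeffs[:, :K]                                    # slopes a_k
    b = coeffs[:, K:]                                    # intercepts b_k
    # y = max_k (a_k * x + b_k), applied elementwise over features.
    pieces = a[:, :, None] * x[:, None, :] + b[:, :, None]  # (batch, K, feat)
    return pieces.max(axis=1)
```

Note that with the hyper network's weights zeroed and the bias fixed to slopes (1, 0) and intercepts (0, 0), the activation collapses to the ordinary static ReLU max(x, 0), which is the sense in which a dynamic rectifier generalizes the static one.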
