Controlling Covariate Shift using Equilibrium Normalization of Weights.
arXiv: Learning (2018)
Abstract
We introduce a new normalization technique that exhibits the fast convergence properties of batch normalization using a transformation of layer weights instead of layer outputs. The proposed technique keeps the contribution of positive and negative weights to the layer output in equilibrium. We validate our method on a set of standard benchmarks including CIFAR-10/100, SVHN and ILSVRC 2012 ImageNet.
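The abstract does not specify the exact transformation, but the core idea — rebalancing each unit's positive and negative weight contributions — can be sketched as follows. This is a minimal illustration, assuming "equilibrium" means that, per output unit, the total positive weight mass is rescaled to match the total negative weight mass; the function name `equilibrium_normalize` is hypothetical and not from the paper.

```python
import numpy as np

def equilibrium_normalize(W, eps=1e-8):
    """Rescale each row of a weight matrix so the sum of its positive
    entries equals the absolute sum of its negative entries.

    Illustrative sketch only; the paper's actual transformation may differ.
    """
    pos = np.clip(W, 0, None)           # positive part of the weights
    neg = np.clip(W, None, 0)           # negative part of the weights
    pos_sum = pos.sum(axis=1, keepdims=True)
    neg_sum = -neg.sum(axis=1, keepdims=True)
    target = (pos_sum + neg_sum) / 2    # common magnitude both sides move to
    # Scale positive and negative parts separately toward the common target.
    return pos * (target / (pos_sum + eps)) + neg * (target / (neg_sum + eps))

W = np.array([[1.0, -2.0],
              [3.0, -1.0]])
W_bal = equilibrium_normalize(W)
```

After the transformation, each output unit's positive and negative contributions have equal magnitude, which is one plausible reading of keeping them "in equilibrium" at the weight level rather than normalizing layer outputs as batch normalization does.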