The Benefits of Over-parameterization at Initialization in Deep ReLU Networks

    arXiv: Machine Learning, 2019.


    Abstract:

    It has been noted in the existing literature that over-parameterization in ReLU networks generally leads to better performance. While there could be several reasons for this, we investigate desirable network properties at initialization which may be enjoyed by ReLU networks. Without making any assumptions, we derive a lower bound on the layer …
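    The abstract is cut off above, but the setting it describes, desirable properties of over-parameterized ReLU networks at initialization, can be probed empirically. The sketch below (an illustration, not the paper's code) assumes the property in question is forward-pass norm preservation under He initialization, where each weight is drawn as W_ij ~ N(0, 2/fan_in): the norm of every hidden activation then concentrates around the norm of the input, and the concentration tightens as the layers get wider.

        import numpy as np

        def relu_forward_he(x, widths, rng):
            """Forward one input through a fully connected ReLU net whose
            weights use He initialization, W_ij ~ N(0, 2 / fan_in).
            Returns ||h_l|| / ||x|| for every hidden layer l."""
            h, fan_in, ratios = x, x.shape[0], []
            for width in widths:
                W = rng.normal(0.0, np.sqrt(2.0 / fan_in), size=(width, fan_in))
                h = np.maximum(W @ h, 0.0)  # ReLU activation
                ratios.append(np.linalg.norm(h) / np.linalg.norm(x))
                fan_in = width
            return ratios

        rng = np.random.default_rng(0)
        x = rng.normal(size=256)
        for width in (64, 4096):  # narrow vs. over-parameterized layers
            print(f"width={width:5d}", np.round(relu_forward_he(x, [width] * 10, rng), 2))

    The narrow network's per-layer ratios drift away from 1 with depth, while the wide network's should stay close to 1; the variance of each ratio shrinks roughly like 1/width, which is one concrete sense in which over-parameterization helps at initialization.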
