Stochastic natural gradient descent draws posterior samples in function space

    arXiv preprint arXiv:1806.09597, 2018.


    Abstract:

    Recent work has argued that gradient descent can approximate the Bayesian uncertainty in model parameters near local minima. In this work we develop a similar correspondence for minibatch natural gradient descent (NGD). We prove that for sufficiently small learning rates, if the model predictions on the training set approach the true con…
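    The abstract describes minibatch natural gradient descent, which preconditions the stochastic gradient with the inverse Fisher information. As a minimal sketch only: the snippet below illustrates a damped NGD step using the empirical Fisher built from per-example gradients, on a toy logistic-regression problem. The model, data, learning rate, and damping constant are all hypothetical choices for illustration, not the paper's experimental setup.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Toy logistic-regression data (hypothetical setup for illustration).
    n, d = 200, 3
    X = rng.normal(size=(n, d))
    w_true = np.array([1.0, -2.0, 0.5])
    y = (rng.random(n) < 1 / (1 + np.exp(-X @ w_true))).astype(float)

    def per_example_grads(w, Xb, yb):
        """Per-example gradients of the negative log-likelihood."""
        p = 1 / (1 + np.exp(-Xb @ w))
        return (p - yb)[:, None] * Xb  # shape (batch, d)

    w = np.zeros(d)
    lr, batch, damping = 0.1, 32, 1e-2
    for step in range(500):
        idx = rng.choice(n, size=batch, replace=False)
        G = per_example_grads(w, X[idx], y[idx])
        g = G.mean(axis=0)                       # minibatch gradient
        # Empirical Fisher from per-example gradients, plus damping
        # so the solve stays well-conditioned on small batches.
        F = G.T @ G / batch + damping * np.eye(d)
        w -= lr * np.linalg.solve(F, g)          # natural-gradient step
    ```

    Iterating this update with a small learning rate is the regime the paper analyses: the minibatch noise in `g`, filtered through the inverse Fisher, is what generates the sampling behaviour near the optimum.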
