On Variance Reduction in Stochastic Gradient Descent and its Asynchronous Variants

Cited by: 120

    Annual Conference on Neural Information Processing Systems, pp. 2647-2655, 2015.

    Abstract:

We study optimization algorithms based on variance reduction for stochastic gradient descent (SGD). Remarkable recent progress has been made in this direction through the development of algorithms like SAG, SVRG, and SAGA. These algorithms have been shown to outperform SGD, both theoretically and empirically. However, asynchronous versions of the...
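
    The abstract refers to variance-reduced gradient methods such as SVRG. As a rough illustration of the core idea only, here is a minimal SVRG-style sketch on a synthetic least-squares problem; the problem setup and all names (grad_i, full_grad, svrg) are illustrative assumptions, not taken from the paper, which concerns a unifying framework and asynchronous variants of such methods.

```python
import numpy as np

# Minimal SVRG-style sketch (illustrative, not the paper's algorithm).
# Objective: F(x) = (1/n) * sum_i 0.5 * (a_i^T x - b_i)^2.
# The variance-reduced gradient estimate is
#     v = grad_i(x) - grad_i(x_snap) + full_grad(x_snap),
# which is unbiased and whose variance shrinks as x approaches the optimum,
# allowing a constant step size where plain SGD needs a decaying one.

rng = np.random.default_rng(0)
n, d = 200, 10
A = rng.standard_normal((n, d))
b = A @ rng.standard_normal(d) + 0.01 * rng.standard_normal(n)

def grad_i(x, i):
    # Gradient of the i-th component function 0.5 * (a_i^T x - b_i)^2.
    return A[i] * (A[i] @ x - b[i])

def full_grad(x):
    # Full gradient of F, averaged over all n components.
    return A.T @ (A @ x - b) / n

def svrg(epochs=30, inner=2 * n, step=0.02):
    x = np.zeros(d)
    for _ in range(epochs):
        x_snap = x.copy()        # snapshot point for this epoch
        mu = full_grad(x_snap)   # full gradient computed once per epoch
        for _ in range(inner):
            i = rng.integers(n)
            v = grad_i(x, i) - grad_i(x_snap, i) + mu  # variance-reduced estimate
            x -= step * v
    return x

x_hat = svrg()
print("final objective:", 0.5 * np.mean((A @ x_hat - b) ** 2))
```

    The key design point is that the inner loop touches only one component gradient per step (like SGD), while the once-per-epoch full gradient anchors the estimate and removes the variance that otherwise forces SGD's step size to decay.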