The Complexity of Making the Gradient Small in Stochastic Convex Optimization

    Dylan J. Foster
    Ayush Sekhari
    Blake E. Woodworth

    COLT, pp. 1319-1345, 2019.


    Abstract:

    We give nearly matching upper and lower bounds on the oracle complexity of finding $\epsilon$-stationary points ($\|\nabla F(x)\| \leq \epsilon$) in stochastic convex optimization. We jointly analyze the oracle complexity in both the local stochastic oracle model and the global oracle (or, statistical learning) model. This allows us to decompose ...
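
    The abstract's central object is simple to state precisely; the LaTeX sketch below spells it out. The population objective $F$, the per-sample loss $f$, and the sample distribution $\mathcal{D}$ are standard notation introduced here for illustration rather than quoted from the paper.

    % Sketch of the setup referenced in the abstract (notation introduced here for illustration).
    % In stochastic convex optimization, the objective is the expectation of a sampled convex loss,
    % and an epsilon-stationary point is a point whose population gradient is small in norm:
    \[
      F(x) \;=\; \mathbb{E}_{z \sim \mathcal{D}}\big[f(x; z)\big],
      \qquad
      \|\nabla F(x)\| \le \epsilon .
    \]
    % The two oracle models the abstract compares: a local stochastic oracle answers a query x with
    % the value and gradient of f(x; z) for a fresh sample z, whereas the global (statistical
    % learning) oracle reveals the entire sampled function f(., z).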


    Best Paper of COLT, 2019