The Complexity of Making the Gradient Small in Stochastic Convex Optimization
COLT, pp. 1319-1345, 2019.
We give nearly matching upper and lower bounds on the oracle complexity of finding $\epsilon$-stationary points ($\|\nabla F(x)\| \leq \epsilon$) in stochastic convex optimization. We jointly analyze the oracle complexity in both the local stochastic oracle model and the global oracle (or, statistical learning) model. This allows us to decompo…
Best Paper of COLT, 2019