Fast incremental method for smooth nonconvex optimization

    CDC, pp. 1971-1977, 2016.

    Abstract:

    We analyze a fast incremental aggregated gradient method for optimizing nonconvex problems of the form min_x Σ_i f_i(x). Specifically, we analyze the SAGA algorithm within an Incremental First-order Oracle (IFO) framework, and show that it converges to a stationary point provably faster than both gradient descent and stochastic gradient descent. …
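
    The method the abstract refers to can be made concrete: SAGA keeps a table of the most recent gradient computed for each component f_i, and combines a fresh component gradient with the table average to form its search direction, so each iteration costs one IFO call. Below is a minimal Python sketch under stated assumptions: the interface (grad_i, step_size, the toy cosine objective) is hypothetical and for illustration only, and the step size shown is not the one prescribed by the paper's analysis.

    import numpy as np

    def saga(grad_i, x0, n, step_size, num_iters, seed=0):
        # Minimal sketch of SAGA for min_x (1/n) * sum_i f_i(x).
        # grad_i(x, i) returns the gradient of the i-th component f_i at x
        # (a hypothetical interface, not specified by the paper).
        rng = np.random.default_rng(seed)
        x = x0.copy()
        # Table of the most recent gradient seen for each component.
        table = np.stack([grad_i(x, i) for i in range(n)])
        avg = table.mean(axis=0)
        for _ in range(num_iters):
            i = rng.integers(n)               # one IFO call per iteration
            g_new = grad_i(x, i)
            # SAGA direction: an unbiased estimate of the full gradient.
            v = g_new - table[i] + avg
            x = x - step_size * v
            # Refresh the table entry and its running average in O(d) time.
            avg = avg + (g_new - table[i]) / n
            table[i] = g_new
        return x

    # Toy nonconvex finite sum: f_i(x) = 1 - cos(a_i . x), so
    # grad f_i(x) = sin(a_i . x) * a_i.
    a = np.random.default_rng(0).normal(size=(100, 5))
    g = lambda x, i: np.sin(a[i] @ x) * a[i]
    x_hat = saga(g, x0=np.ones(5), n=100, step_size=0.05, num_iters=5000)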