Fast incremental method for smooth nonconvex optimization
IEEE Conference on Decision and Control (CDC), pp. 1971-1977, 2016.
We analyze a fast incremental aggregated gradient method for optimizing nonconvex problems of the form min_x Σ_i f_i(x). Specifically, we analyze the SAGA algorithm within an Incremental First-order Oracle framework, and show that it converges to a stationary point provably faster than both gradient descent and stochastic gradient descent.
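For reference, a minimal sketch of the SAGA update the abstract refers to, written in NumPy; the function name, signature, and step size below are illustrative choices, not the paper's implementation:

```python
import numpy as np

def saga(grad_i, x0, n, step_size=0.01, num_iters=1000, rng=None):
    """Minimal SAGA sketch for min_x (1/n) * sum_i f_i(x).

    grad_i(i, x) returns the gradient of the i-th component f_i at x.
    A table of stored component gradients yields an aggregated,
    variance-reduced estimate of the full gradient at each step.
    """
    rng = np.random.default_rng() if rng is None else rng
    x = np.asarray(x0, dtype=float).copy()
    # Table holding the last gradient seen for each component f_i.
    table = np.array([grad_i(i, x) for i in range(n)])
    avg = table.mean(axis=0)  # running average of stored gradients
    for _ in range(num_iters):
        i = rng.integers(n)
        g_new = grad_i(i, x)
        # SAGA estimator: unbiased for the full gradient, with
        # lower variance than the plain stochastic gradient.
        v = g_new - table[i] + avg
        x -= step_size * v
        # Refresh the table entry and its average in O(d) time.
        avg += (g_new - table[i]) / n
        table[i] = g_new
    return x
```

Maintaining the running average incrementally keeps each iteration at the cost of a single component gradient, which is what makes the incremental aggregated scheme cheaper per step than full gradient descent.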