Fast Incremental Method for Nonconvex Optimization

arXiv: Optimization and Control (2016)

Abstract
We analyze a fast incremental aggregated gradient method for optimizing nonconvex problems of the form $\min_x \sum_i f_i(x)$. Specifically, we analyze the SAGA algorithm within an Incremental First-order Oracle framework, and show that it converges to a stationary point provably faster than both gradient descent and stochastic gradient descent. We also discuss Polyak's special class of nonconvex problems for which SAGA converges at a linear rate to the global optimum. Finally, we analyze the practically valuable regularized and minibatch variants of SAGA. To our knowledge, this paper presents the first analysis of fast convergence for an incremental aggregated gradient method for nonconvex problems.
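For readers unfamiliar with SAGA, the update the abstract refers to can be sketched as follows. This is a minimal illustration of the standard SAGA step (sample one component, correct its stochastic gradient with a stored gradient table), not the paper's exact nonconvex variant; the names `saga` and `grad_fi`, the step size, and the iteration budget are illustrative assumptions.

```python
import numpy as np

def saga(grad_fi, x0, n, step=0.01, iters=1000, rng=None):
    """Sketch of SAGA for min_x (1/n) * sum_i f_i(x).

    grad_fi(x, i) must return the gradient of the i-th component f_i at x.
    """
    rng = np.random.default_rng() if rng is None else rng
    x = np.asarray(x0, dtype=float).copy()
    # Table of the last-seen gradient of every component, plus its running mean.
    table = np.stack([grad_fi(x, i) for i in range(n)])
    mean = table.mean(axis=0)
    for _ in range(iters):
        j = rng.integers(n)
        g_new = grad_fi(x, j)
        # Bias-corrected stochastic gradient: new grad minus stale grad plus mean.
        x = x - step * (g_new - table[j] + mean)
        # Maintain the mean in O(d) per step before overwriting the table entry.
        mean = mean + (g_new - table[j]) / n
        table[j] = g_new
    return x

if __name__ == "__main__":
    # Toy check: f_i(x) = 0.5 * ||x - a_i||^2, whose sum is minimized at mean(a_i).
    a = np.random.default_rng(0).normal(size=(50, 3))
    x_hat = saga(lambda x, i: x - a[i], np.zeros(3), n=50, step=0.1, iters=5000)
    print(np.allclose(x_hat, a.mean(axis=0), atol=1e-2))  # expect True
```

The incremental O(d) update of the running mean, rather than re-averaging the full gradient table each step, is the bookkeeping that makes aggregated-gradient methods cheap per iteration.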
Keywords
Convergence, Radio frequency, Indexes, Complexity theory, Gradient methods, Machine learning algorithms