First-Order Methods for Nonconvex Quadratic Minimization

SIAM Review (2020)

Abstract
We consider minimization of indefinite quadratics with either trust-region (norm) constraints or cubic regularization. Despite the nonconvexity of these problems, we prove that, under mild assumptions, gradient descent converges to their global solutions and give a nonasymptotic rate of convergence for the cubic variant. We also consider Krylov subspace solutions and establish sharp convergence guarantees to the solutions of both trust-region and cubic-regularized problems. Our rates mirror the behavior of these methods on convex quadratics and eigenvector problems, highlighting their scalability. When we use Krylov subspace solutions to approximate the cubic-regularized Newton step, our results recover the strongest known convergence guarantees to approximate second-order stationary points of general smooth nonconvex functions.
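
As a concrete illustration (not the paper's algorithm statement), below is a minimal sketch of plain gradient descent on a cubic-regularized indefinite quadratic f(x) = (1/2) x^T A x + b^T x + (rho/3) ||x||^3, the cubic variant described in the abstract. The problem data, the initialization at the origin, and the constant step size are illustrative assumptions; the paper's global-convergence guarantee holds under its own mild assumptions and step-size conditions.

import numpy as np

rng = np.random.default_rng(0)
n = 50
Q = rng.standard_normal((n, n))
A = (Q + Q.T) / 2            # symmetric and, in general, indefinite
b = rng.standard_normal(n)
rho = 1.0                    # cubic regularization weight (illustrative choice)

def grad(x):
    # gradient of f(x) = 0.5 x^T A x + b^T x + (rho/3) ||x||^3
    return A @ x + b + rho * np.linalg.norm(x) * x

x = np.zeros(n)              # initialization at the origin (an assumption here;
                             # the paper's analysis prescribes its own choices)
eta = 0.01                   # small constant step size, chosen for illustration
for _ in range(20000):
    g = grad(x)
    if np.linalg.norm(g) < 1e-8:
        break
    x = x - eta * g

print("gradient norm at termination:", np.linalg.norm(grad(x)))

The Krylov subspace approach the abstract refers to instead minimizes the same objective restricted to the subspace spanned by b, Ab, A^2 b, ..., which requires only matrix-vector products with A; this is the source of the scalability the abstract highlights.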
Keywords
gradient descent, Krylov subspace methods, nonconvex quadratics, cubic regularization, trust-region methods, global optimization, Newton's method, nonasymptotic convergence