Conditions for linear convergence of the gradient method for non-convex optimization

arXiv (2023)

Citations: 2 | Views: 0
Abstract
In this paper, we derive a new linear convergence rate for the gradient method with fixed step lengths applied to non-convex smooth optimization problems satisfying the Polyak-Łojasiewicz (PŁ) inequality. We establish that the PŁ inequality is a necessary and sufficient condition for linear convergence to the optimal value for this class of problems. We also list some related classes of functions for which the gradient method may enjoy a linear convergence rate, and we investigate their relationship with the PŁ inequality.
Keywords
Weakly convex optimization, Gradient method, Performance estimation problem, Polyak-Lojasiewicz inequality, Semidefinite programming