Convergence Rates for Deterministic and Stochastic Subgradient Methods without Lipschitz Continuity

SIAM Journal on Optimization (2019)

Cited by 29 | Viewed 7
Abstract
We extend the classic convergence rate theory for subgradient methods to apply to non-Lipschitz functions. For the deterministic projected subgradient method, we present a global O(1/√T) convergence rate for any convex function which is locally Lipschitz around its minimizers. This approach is based on Shor's classic subgradient analysis and implies generalizations of the standard convergence rates for gradient descent on functions with Lipschitz or Hölder continuous gradients. Further, we show an O(1/√T) convergence rate for the stochastic projected subgradient method on convex functions with at most quadratic growth, which improves to O(1/T) under either strong convexity or a weaker quadratic lower bound condition.
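For illustration, here is a minimal Python sketch of the projected subgradient update x_{k+1} = P_X(x_k − α_k g_k) referenced in the abstract. The feasible set (a Euclidean ball), the ℓ1-type objective, and the 1/√(k+1) step size are illustrative assumptions for a convex, non-smooth example; they are not the paper's specific setup or analysis.

```python
import numpy as np

def project_ball(x, radius=1.0):
    """Euclidean projection onto {x : ||x|| <= radius} (a hypothetical feasible set)."""
    norm = np.linalg.norm(x)
    return x if norm <= radius else (radius / norm) * x

def projected_subgradient(subgrad, x0, steps=1000, radius=1.0):
    """Projected subgradient method: x_{k+1} = P_X(x_k - alpha_k * g_k).

    The step size alpha_k = 1/sqrt(k+1) is a standard illustrative choice
    matching the O(1/sqrt(T)) flavor of rate discussed in the abstract;
    it is not claimed to be the paper's exact algorithmic setting.
    """
    x = project_ball(np.asarray(x0, dtype=float), radius)
    for k in range(steps):
        g = subgrad(x)                      # any subgradient of f at x
        alpha = 1.0 / np.sqrt(k + 1)        # diminishing step size
        x = project_ball(x - alpha * g, radius)
    return x

# Example: minimize f(x) = ||x - c||_1 over the unit ball (illustrative problem).
c = np.array([2.0, -1.0, 0.5])
subgrad = lambda x: np.sign(x - c)          # a subgradient of the l1 distance to c
x_final = projected_subgradient(subgrad, x0=np.zeros(3), steps=5000)
```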
Keywords
convex optimization, subgradient method, convergence, non-Lipschitz optimization