Tight analyses for subgradient descent I: Lower bounds

Semantic Scholar (2021)

Abstract
Consider the problem of minimizing functions that are Lipschitz and convex, but not necessarily differentiable. We construct a function from this class for which the T-th iterate of subgradient descent has error Ω(log(T)/√T). This matches a known upper bound of O(log(T)/√T). We prove analogous results for functions that are additionally strongly convex: there exists such a function for which the T-th iterate of subgradient descent has error Ω(log(T)/T), matching a known upper bound of O(log(T)/T). These results resolve a question posed by Shamir (2012).
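To make the setting concrete, below is a minimal sketch of projected subgradient descent on a simple Lipschitz convex, non-differentiable objective. The objective f(x) = Σ|x_i|, the Euclidean-ball feasible set, and the 1/√t step size are standard illustrative choices for this problem class; they are not the paper's lower-bound construction.

```python
# A generic sketch of projected subgradient descent, the method analyzed in the
# abstract. The objective and step-size schedule here are illustrative defaults,
# not the paper's construction.
import numpy as np

def subgradient_descent(x0, subgrad, T, radius=1.0, lipschitz=1.0):
    """Run T steps of projected subgradient descent over a Euclidean ball."""
    x = np.asarray(x0, dtype=float)
    iterates = [x.copy()]
    for t in range(1, T + 1):
        eta = radius / (lipschitz * np.sqrt(t))   # standard O(1/sqrt(t)) step size
        x = x - eta * subgrad(x)
        norm = np.linalg.norm(x)
        if norm > radius:                         # project back onto the ball
            x = x * (radius / norm)
        iterates.append(x.copy())
    return iterates

if __name__ == "__main__":
    d = 5
    f = lambda x: np.sum(np.abs(x))               # Lipschitz, convex, non-smooth
    g = lambda x: np.sign(x)                      # a valid subgradient of f
    # f is sqrt(d)-Lipschitz w.r.t. the Euclidean norm on the unit ball.
    iters = subgradient_descent(np.full(d, 0.5), g, T=1000, lipschitz=np.sqrt(d))
    print("f(x_T) =", f(iters[-1]))
```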