A Note on the Optimal Convergence Rate of Descent Methods with Fixed Step Sizes for Smooth Strongly Convex Functions

Journal of Optimization Theory and Applications (2022)

Abstract
Based on a result by Taylor et al. (J Optim Theory Appl 178(2):455–476, 2018) on the attainable convergence rate of gradient descent for smooth and strongly convex functions in terms of function values, an elementary convergence analysis for general descent methods with fixed step sizes is presented. It covers general variable metric methods, gradient-related search directions under angle and scaling conditions, as well as inexact gradient methods. In all cases, optimal rates are obtained.
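As a rough numerical illustration of the setting (not the paper's analysis), the sketch below runs fixed-step gradient descent on an L-smooth, mu-strongly convex quadratic and checks the per-iteration function-value contraction max{(1 - h*mu)^2, (1 - h*L)^2}, which is the form of the rate attributed to Taylor et al. (2018); the test function, step size, and constants here are assumptions chosen for the demo.

```python
import numpy as np

# Fixed-step gradient descent on f(x) = 0.5 * x^T A x, with A = diag(mu, L),
# so f is mu-strongly convex and L-smooth, and f* = 0 at x* = 0.
mu, L = 1.0, 10.0
A = np.diag([mu, L])
f = lambda x: 0.5 * x @ A @ x
grad = lambda x: A @ x

h = 2.0 / (mu + L)  # classical fixed step size for this regime (an assumed choice)
# Assumed per-iteration rate in function values, in the form reported by
# Taylor et al. (2018) for fixed-step gradient descent:
rate = max((1.0 - h * mu) ** 2, (1.0 - h * L) ** 2)

x = np.array([1.0, 1.0])
f0 = f(x)
for k in range(1, 21):
    x = x - h * grad(x)
    # Observed decrease should respect the geometric bound rate**k * f0.
    assert f(x) <= rate ** k * f0 + 1e-12
print("per-iteration rate bound:", rate)
```

On this quadratic each coordinate contracts by |1 - h*lambda_i| per step, so the bound holds with equality in the worst coordinate; general smooth strongly convex functions are what the paper's analysis addresses.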
Keywords
Convergence rate estimates, Variable metric method, Inexact gradient method, SR1 update, 90C25, 65K05