Potential-Function Proofs for Gradient Methods

THEORY OF COMPUTING (2019)

Abstract
This note discusses proofs of convergence for gradient methods (also called "first-order methods") based on simple potential-function arguments. We cover methods like gradient descent (for both smooth and non-smooth settings), mirror descent, and some accelerated variants. We hope the structure and presentation of these amortized-analysis proofs will be useful as a guiding principle in learning and using these proofs.
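To illustrate the kind of potential-function argument the note describes, here is a minimal sketch (not code from the paper): for gradient descent with step size 1/L on an L-smooth convex function f with minimizer x*, a standard potential is Φ_t = t·(f(x_t) − f*) + (L/2)·‖x_t − x*‖², which is non-increasing along the iterates. The quadratic objective and the constants below are illustrative choices, not taken from the paper.

```python
import numpy as np

# Illustrative setup: f(x) = 0.5 * x^T A x is convex and L-smooth with
# L = largest eigenvalue of A; its minimizer is x* = 0 with f* = 0.
A = np.diag([1.0, 4.0])        # eigenvalues 1 and 4, so L = 4
L = 4.0
f = lambda x: 0.5 * x @ A @ x
grad = lambda x: A @ x

x = np.array([1.0, 1.0])       # arbitrary starting point
phi_prev = None
for t in range(50):
    # Potential: Phi_t = t*(f(x_t) - f*) + (L/2)*||x_t - x*||^2
    phi = t * f(x) + (L / 2) * np.dot(x, x)
    if phi_prev is not None:
        # The amortized-analysis proof guarantees the potential never increases.
        assert phi <= phi_prev + 1e-12, "potential increased"
    phi_prev = phi
    x = x - grad(x) / L        # gradient step with step size 1/L
```

Since Φ_T ≥ T·(f(x_T) − f*), the monotonicity of the potential immediately yields the familiar O(1/T) convergence rate f(x_T) − f* ≤ Φ_0/T, which is the style of argument the note develops.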
Keywords
convex optimization,potential functions,amortized analysis