An Estimate Sequence for Geodesically Convex Optimization.

COLT 2018

Abstract
In this paper, we propose a Riemannian version of Nesterov's Accelerated Gradient algorithm (R-AGD), and show that for geodesically smooth and strongly convex problems, within a neighborhood whose radius depends on the condition number as well as the sectional curvature of the manifold, R-AGD converges to the minimum with acceleration. Unlike the algorithm in (Liu et al., 2017), which requires the exact solution of a nonlinear equation that may itself be intractable, our algorithm is constructive and efficiently implementable. Our proof exploits a new estimate sequence and a novel bound on the nonlinear metric distortion, both of which may be of independent interest.
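The abstract does not spell out the update rule, but the flavor of R-AGD can be illustrated by transplanting Nesterov's extrapolation step onto a manifold via its exponential and logarithm maps. The Python sketch below is a schematic assumption, not the paper's exact construction: the sphere helpers (sphere_exp, sphere_log), the constant step size and momentum, and the test objective are all illustrative choices.

import numpy as np

def sphere_exp(x, v):
    # Exponential map on the unit sphere: follow the geodesic from x
    # in the tangent direction v (closed form for |x| = 1).
    t = np.linalg.norm(v)
    return x if t < 1e-12 else np.cos(t) * x + np.sin(t) * (v / t)

def sphere_log(x, y):
    # Logarithm map: tangent vector at x pointing along the geodesic to y.
    p = y - np.dot(x, y) * x
    n = np.linalg.norm(p)
    if n < 1e-12:
        return np.zeros_like(x)
    return np.arccos(np.clip(np.dot(x, y), -1.0, 1.0)) * p / n

def riemannian_agd(riem_grad, x0, lr=0.1, beta=0.9, n_iters=200):
    # Nesterov-style acceleration transplanted to the sphere (a sketch,
    # not the paper's R-AGD): extrapolate along the geodesic away from
    # the previous iterate, then take a gradient step from that point.
    x_prev, x = x0.copy(), x0.copy()
    for _ in range(n_iters):
        # Momentum: the analogue of y = x + beta * (x - x_prev).
        y = sphere_exp(x, -beta * sphere_log(x, x_prev))
        # Gradient step from y; the exp map keeps the iterate on the sphere.
        x_prev, x = x, sphere_exp(y, -lr * riem_grad(y))
    return x

# Hypothetical example: minimize f(x) = -<a, x> on the sphere, which is
# geodesically convex on the hemisphere around its minimizer a.
a = np.array([0.0, 0.0, 1.0])
riem_grad = lambda x: -(a - np.dot(x, a) * x)  # Euclidean grad projected to T_x
x = riemannian_agd(riem_grad, np.array([1.0, 0.0, 0.0]))
print(x)  # should approach [0, 0, 1]

On the sphere both maps have closed forms; on a general manifold they would be replaced by that manifold's own exp/log (or retractions), and the step size and momentum would be tied to the smoothness and strong-convexity constants and the curvature bound, as in the neighborhood condition stated in the abstract.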