Tight Convergence Rate of Gradient Descent for Eigenvalue Computation

IJCAI 2020

Abstract
Riemannian gradient descent (RGD) is a simple, popular, and efficient algorithm for leading eigenvector computation [Absil et al., 2009]. However, the existing analysis of RGD for the eigenvalue problem is not tight: the best known bound, due to [Xu et al., 2018], is $O(\frac{1}{\Delta^2} \ln \frac{n}{\epsilon})$. In this paper, we show that RGD in fact converges at rate $O(\frac{1}{\Delta} \ln \frac{n}{\epsilon})$, and we give instances demonstrating that this bound is tight. This improves the eigengap dependence of the best prior analysis from quadratic to linear. We also give a tight convergence analysis of a deterministic variant of Oja's rule [Oja, 1982], showing that it too enjoys the fast convergence rate $O(\frac{1}{\Delta} \ln \frac{n}{\epsilon})$; previous papers gave only asymptotic characterizations [Oja, 1982; Oja, 1989; Yi et al., 2005]. Our tools for proving these convergence results include a novel reduction-and-chaining technique and a noisy fixed-point iteration argument. Finally, we provide empirical validation of our convergence rates on synthetic and real data.
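To make the algorithm under analysis concrete, below is a minimal sketch of RGD on the unit sphere for the leading eigenvector of a symmetric matrix A, using the standard tangent-space projection of the ambient gradient and a normalization retraction. The function name, step size `eta`, tolerance `tol`, iteration cap, and the synthetic test matrix are illustrative assumptions and are not taken from the paper.

```python
import numpy as np

def rgd_leading_eigenvector(A, eta=0.2, tol=1e-8, max_iters=10_000, seed=0):
    """Riemannian gradient ascent on the unit sphere for the leading
    eigenvector of a symmetric matrix A -- illustrative sketch only."""
    rng = np.random.default_rng(seed)
    x = rng.standard_normal(A.shape[0])
    x /= np.linalg.norm(x)                     # random start on the sphere
    for _ in range(max_iters):
        Ax = A @ x
        # Riemannian gradient of f(x) = x^T A x (up to a constant factor):
        # project Ax onto the tangent space at x by removing the radial part.
        grad = Ax - (x @ Ax) * x
        if np.linalg.norm(grad) < tol:         # assumed stopping rule
            break
        x = x + eta * grad                     # gradient step in the ambient space
        x /= np.linalg.norm(x)                 # retract back onto the sphere
    return x

if __name__ == "__main__":
    # Symmetric test matrix with a clear eigengap Delta = 2.0 - 1.0 = 1.0.
    rng = np.random.default_rng(1)
    Q, _ = np.linalg.qr(rng.standard_normal((50, 50)))
    lams = np.concatenate([np.linspace(0.5, 1.0, 49), [2.0]])
    A = Q @ np.diag(lams) @ Q.T
    x = rgd_leading_eigenvector(A)
    print("Rayleigh quotient:", x @ A @ x, "  top eigenvalue:", lams.max())
```

The deterministic Oja variant discussed in the abstract differs mainly in how the iterate is kept (approximately) unit norm; the explicit normalization retraction used here is one common choice.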