Provable non-convex projected gradient descent for a class of constrained matrix optimization problems.

arXiv: Machine Learning (2016)

Abstract
We propose a simple and scalable non-convex method for low-rank PSD matrix problems with a generic (strongly) convex objective $f$ and additional matrix norm constraints. Such criteria appear in quantum state tomography and phase retrieval applications, among others. However, without careful design, existing methods quickly run into time and memory bottlenecks as problem dimensions increase. To remedy these shortcomings, we propose the Projected Factored Gradient Descent (ProjFGD) algorithm, which operates on a low-rank factorization of the variable space. This factorization renders the optimization non-convex; nevertheless, we show that our method achieves a local linear convergence rate in the non-convex factored space, for a class of convex norm-constrained problems. We build our theory on a novel descent lemma that extends recent results on the unconstrained version of the problem. Our findings are supported by empirical evidence on quantum state tomography and sparse phase retrieval applications.
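To make the factored approach concrete, the following is a minimal sketch of a ProjFGD-style iteration: gradient steps are taken on the low-rank factor $U$ of $X = UU^\top$, followed by a projection of $U$ onto a norm ball. The function names, the Frobenius-norm constraint, and the toy objective are illustrative assumptions, not the paper's exact formulation or step-size rules.

```python
import numpy as np

def projfgd_sketch(grad_f, U0, radius, step=0.01, iters=2000):
    """Illustrative projected factored gradient descent.

    Minimizes f(X) over PSD X = U U^T of fixed rank by gradient steps on
    the factor U, projecting U onto a Frobenius-norm ball after each step.
    This is a hypothetical sketch, not the paper's algorithm verbatim.
    """
    U = U0.copy()
    for _ in range(iters):
        X = U @ U.T
        # Chain rule: the gradient w.r.t. U is (grad f(X) + grad f(X)^T) U.
        G = (grad_f(X) + grad_f(X).T) @ U
        U = U - step * G
        # Euclidean projection onto {U : ||U||_F <= radius} is a rescaling.
        nrm = np.linalg.norm(U)
        if nrm > radius:
            U *= radius / nrm
    return U

# Toy usage: recover a rank-1 PSD matrix with f(X) = 0.5 * ||X - M||_F^2,
# whose gradient is simply X - M.
rng = np.random.default_rng(0)
u_true = rng.standard_normal((5, 1))
M = u_true @ u_true.T
U = projfgd_sketch(lambda X: X - M,
                   rng.standard_normal((5, 1)),
                   radius=10.0)
err = np.linalg.norm(U @ U.T - M)
```

Working in the factor $U$ keeps the iterate PSD and low-rank by construction, so no expensive eigendecomposition-based projection onto the PSD cone is needed; the cost per iteration is dominated by matrix products with the thin factor.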