On the Power of Truncated SVD for General High-rank Matrix Estimation Problems

Advances in Neural Information Processing Systems 30 (NIPS 2017)

Abstract
We show that given an estimate Â that is close to a general high-rank positive semidefinite (PSD) matrix A in spectral norm (i.e., ‖Â − A‖₂ ≤ δ), the simple truncated Singular Value Decomposition of Â produces a multiplicative approximation of A in Frobenius norm. This observation leads to many interesting results on general high-rank matrix estimation problems:

1. High-rank matrix completion: we show that it is possible to recover a general high-rank matrix A up to (1 + ε) relative error in Frobenius norm from partial observations, with sample complexity independent of the spectral gap of A.

2. High-rank matrix denoising: we design an algorithm that recovers a matrix A with error in Frobenius norm from its noise-perturbed observations, without assuming A is exactly low-rank.

3. Low-dimensional approximation of high-dimensional covariance: given N i.i.d. samples of dimension n from N_n(0, A), we show that it is possible to approximate the covariance matrix A with relative error in Frobenius norm with N ≈ n, improving over classical covariance estimation results, which require N ≈ n².
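The phenomenon behind the abstract's first claim can be illustrated numerically: a perturbation that is small in spectral norm may still be large in Frobenius norm, and truncating the small singular values of the perturbed matrix removes most of that Frobenius-norm error. The sketch below is not from the paper; the matrix sizes, the decaying spectrum, the choice of δ·I as a worst-case perturbation, and the ~2δ truncation threshold are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200
# A PSD matrix with a slowly decaying spectrum: high-rank, no spectral gap.
Q, _ = np.linalg.qr(rng.standard_normal((n, n)))
eigvals = 1.0 / np.arange(1, n + 1)
A = (Q * eigvals) @ Q.T

# A perturbation with spectral norm delta that is maximal in Frobenius norm:
# delta * I has ||.||_2 = delta but ||.||_F = delta * sqrt(n).
delta = 0.05
A_hat = A + delta * np.eye(n)

# Truncated eigendecomposition of A_hat (A_hat is symmetric PSD here, so this
# coincides with its truncated SVD): drop components below a ~2*delta threshold.
w, V = np.linalg.eigh(A_hat)
keep = w > 2 * delta
A_trunc = (V[:, keep] * w[keep]) @ V[:, keep].T

err_raw = np.linalg.norm(A_hat - A, "fro")      # = delta * sqrt(n), about 0.71
err_trunc = np.linalg.norm(A_trunc - A, "fro")  # noticeably smaller
print(err_trunc < err_raw)
```

Intuitively, the truncation pays a small error on the few retained directions but discards the perturbation's contribution on the many tail directions, where A itself is already small.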