Tensor Completion with Nearly Linear Samples Given Weak Side Information

PROCEEDINGS OF THE ACM ON MEASUREMENT AND ANALYSIS OF COMPUTING SYSTEMS (2022)

Abstract
Tensor completion exhibits an interesting computational-statistical gap in terms of the number of samples needed to perform tensor estimation. While there are only Θ(tn) degrees of freedom in a t-order tensor with n^t entries, the best known polynomial-time algorithm requires O(n^{t/2}) samples in order to guarantee consistent estimation. In this paper, we show that weak side information is sufficient to reduce the sample complexity to O(n). The side information consists of a weight vector for each of the modes which is not orthogonal to any of the latent factors along that mode; this is significantly weaker than assuming noisy knowledge of the subspaces. We provide an algorithm that utilizes this side information to produce a consistent estimator with O(n^{1+κ}) samples for any small constant κ > 0. We also provide experiments on both synthetic and real-world datasets that validate our theoretical insights.
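The side-information condition above is a statement about inner products, not about knowing the latent subspaces. A minimal sketch (not the paper's algorithm; all names and the rank/dimension choices are illustrative assumptions) of what that condition looks like for a low-rank order-3 tensor:

```python
import numpy as np

# Hypothetical illustration: build a rank-r order-3 tensor from latent factors
# and check the abstract's condition that each mode's weight vector is not
# orthogonal to any latent factor along that mode.
rng = np.random.default_rng(0)
n, r = 50, 3                      # dimension per mode and CP rank (arbitrary choices)

# Latent factors U[k] of shape (n, r); the tensor is a sum of r rank-1 terms.
U = [rng.standard_normal((n, r)) for _ in range(3)]
T = np.einsum('ir,jr,kr->ijk', U[0], U[1], U[2])

# "Weak" side information: one weight vector per mode. A generic (e.g. random)
# vector is almost surely non-orthogonal to every latent factor of that mode.
w = [rng.standard_normal(n) for _ in range(3)]

# Condition from the abstract: <w_k, u_{k,j}> != 0 for every factor j of mode k.
for k in range(3):
    inner = U[k].T @ w[k]          # r inner products for mode k
    assert np.all(np.abs(inner) > 1e-8), f"mode {k}: weight vector orthogonal to a factor"
    print(f"mode {k}: min |<w, u>| = {np.abs(inner).min():.3f}")
```

Note that this only verifies the assumption on the weight vectors; how the estimator exploits them is described in the paper itself.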
Keywords
tensor completion, side information, low rank, matrix estimation