Sparse Random Tensors: Concentration, Regularization And Applications

Electronic Journal of Statistics (2021)

Abstract
We prove a non-asymptotic concentration inequality for the spectral norm of sparse inhomogeneous random tensors with Bernoulli entries. For an order-$k$ inhomogeneous random tensor $T$ with sparsity $p_{\max} \ge c\log n/n$, we show that $\|T - \mathbb{E}T\| = O(\sqrt{np_{\max}}\,\log^{k-2} n)$ with high probability. The optimality of this bound up to polylog factors is provided by an information-theoretic lower bound. By tensor unfolding, we extend the range of sparsity to $p_{\max} \ge c\log n/n^{m}$ with $1 \le m \le k-1$ and obtain concentration inequalities for different sparsity regimes. We also provide a simple way to regularize $T$ such that $O(\sqrt{n^{m}p_{\max}})$ concentration still holds down to sparsity $p_{\max} \ge c/n^{m}$ with $k/2 \le m \le k-1$. We present our concentration and regularization results with two applications: (i) a randomized construction of hypergraphs of bounded degrees with good expander mixing properties; (ii) concentration of sparsified tensors under uniform sampling.
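The tensor-unfolding step mentioned in the abstract (grouping the first $m$ modes of an order-$k$ tensor into rows and the remaining $k-m$ modes into columns) can be sketched as follows. This is a minimal NumPy illustration with hypothetical names, not the paper's implementation; the paper's bounds concern the spectral norm of such centered unfoldings.

```python
import numpy as np

def unfold(T, m):
    """Flatten an order-k tensor with equal side lengths n into an
    n^m x n^(k-m) matrix: first m modes -> rows, last k-m -> columns."""
    k = T.ndim
    n = T.shape[0]
    return T.reshape(n**m, n**(k - m))

# Example: order-3 Bernoulli tensor with entrywise probability p_max.
rng = np.random.default_rng(0)
n, p_max = 8, 0.3
T = (rng.random((n, n, n)) < p_max).astype(float)

M = unfold(T, 1)  # shape (8, 64): mode-1 unfolding
# Spectral norm of the centered unfolding, the quantity the bounds control.
sigma = np.linalg.norm(M - p_max, ord=2)
```

Choosing $m$ trades off the two matrix dimensions, which is what extends the admissible sparsity range to $p_{\max} \ge c\log n/n^m$.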
Keywords
Sparse random tensor, spectral norm, hypergraph expander, tensor sparsification
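The tensor-sparsification application can be illustrated by the standard uniform-sampling scheme: keep each entry independently with probability $q$ and rescale kept entries by $1/q$, giving an unbiased estimator of $T$. This is a hedged sketch with hypothetical names, not the paper's code; the paper's results bound the spectral-norm deviation of such sparsified tensors.

```python
import numpy as np

def sparsify(T, q, rng):
    """Keep each entry of T independently with probability q, rescale
    survivors by 1/q so that E[sparsify(T)] = T entrywise."""
    mask = rng.random(T.shape) < q
    return np.where(mask, T / q, 0.0)

rng = np.random.default_rng(1)
T = rng.random((5, 5, 5))
S = sparsify(T, 0.2, rng)  # roughly 20% of entries survive, scaled by 5
```

Unbiasedness is entrywise; the concentration results then control how far $S$ can drift from $T$ in spectral norm (after unfolding) as a function of $q$ and the tensor dimensions.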