Matrix anti-concentration inequalities with applications

arXiv (Cornell University), 2022

Abstract
We provide a polynomial lower bound on the minimum singular value of an $m\times m$ random matrix $M$ with jointly Gaussian entries, under a polynomial bound on the matrix norm and the global small-ball probability bound $$\inf_{x,y\in S^{m-1}}\mathbb{P}\left(\left|x^* M y\right|>m^{-O(1)}\right)\ge \frac{1}{2}.$$ Under the additional assumption that $M$ is self-adjoint, the global small-ball probability bound can be replaced by a weaker version. We establish two matrix anti-concentration inequalities, which lower bound the minimum singular value of a sum of independent positive semidefinite self-adjoint matrices and of a linear combination of independent random matrices with independent Gaussian coefficients; both hold under a global small-ball probability assumption. As a major application, we prove a sharper singular value bound for the Krylov space matrix, which leads to a faster and simpler algorithm for solving sparse linear systems. Our algorithm runs in $\tilde{O}\left(n^{\frac{3\omega-4}{\omega-1}}\right)=O(n^{2.2716})$ time, where $\omega<2.37286$ is the matrix multiplication exponent, improving on the previous fastest algorithm of Peng and Vempala, which runs in $\tilde{O}\left(n^{\frac{5\omega-4}{\omega+1}}\right)=O(n^{2.33165})$ time.
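As a quick sanity check (not part of the paper), the sketch below evaluates the two quoted running-time exponents $(3\omega-4)/(\omega-1)$ and $(5\omega-4)/(\omega+1)$ at $\omega = 2.37286$, and empirically computes the minimum singular value of an $m\times m$ matrix with i.i.d. standard Gaussian entries, an illustrative special case of the jointly Gaussian model in the theorem; the variable names and the choice $m = 400$ are assumptions made here for illustration.

```python
import numpy as np

# 1) Running-time exponents quoted in the abstract, from omega < 2.37286.
omega = 2.37286                            # bound on the matrix multiplication exponent
new_exp = (3 * omega - 4) / (omega - 1)    # this paper's algorithm: ~2.2716
old_exp = (5 * omega - 4) / (omega + 1)    # Peng-Vempala algorithm: ~2.33165
print(f"new exponent: {new_exp:.4f}, previous exponent: {old_exp:.5f}")

# 2) Empirical minimum singular value of an m x m Gaussian matrix
#    (i.i.d. N(0,1) entries, a special case of jointly Gaussian entries).
rng = np.random.default_rng(0)
m = 400
M = rng.standard_normal((m, m))
sigma_min = np.linalg.svd(M, compute_uv=False)[-1]   # singular values are sorted descending
# The observed value is polynomially (not exponentially) small in m,
# consistent with the m^{-O(1)} scale in the statement above.
print(f"sigma_min = {sigma_min:.4e}, m^-1 = {1/m:.4e}")
```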
Keywords
matrix, anti-concentration