Further Limitations of the Known Approaches for Matrix Multiplication.

Innovations in Theoretical Computer Science (ITCS), 2018

Abstract
We consider the techniques behind the current best algorithms for matrix multiplication. Our results are threefold. (1) We provide a unifying framework, showing that all known matrix multiplication running times since 1986 can be achieved from a single very natural tensor: the structural tensor T_q of addition modulo an integer q. (2) We show that if one applies a generalization of the known techniques (arbitrary zeroing out of tensor powers to obtain independent matrix products, in order to use the asymptotic sum inequality of Schönhage) to an arbitrary monomial degeneration of T_q, then there is an explicit lower bound, depending on q, on the bound on the matrix multiplication exponent omega that one can achieve. We also show upper bounds on the value alpha that one can achieve, where alpha is such that n × n^alpha × n matrix multiplication can be computed in n^{2+o(1)} time. (3) We show that our lower bound on omega approaches 2 as q goes to infinity. This suggests a promising approach to improving the bound on omega: for variable q, find a monomial degeneration of T_q which, using the known techniques, produces an upper bound on omega as a function of q. Then, take q to infinity. It is not ruled out, and hence possible, that one can obtain omega = 2 in this way.
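For concreteness, below is a minimal Python/NumPy sketch, not taken from the paper, of the objects the abstract refers to: the structural tensor T_q of addition modulo q, the matrix multiplication tensor <n, m, p>, the Kronecker (tensor) power, and the zeroing-out operation. The indexing convention for T_q (a 1 at position (i, j, k) iff i + j + k = 0 mod q) and all function names are illustrative assumptions; the variant placing the 1 at z-index (i + j) mod q differs only by a relabeling of the z-variables.

import numpy as np

def structural_tensor(q: int) -> np.ndarray:
    """Structural tensor T_q of addition modulo q (assumed convention):
    T_q[i, j, k] = 1 iff i + j + k = 0 (mod q)."""
    T = np.zeros((q, q, q), dtype=int)
    for i in range(q):
        for j in range(q):
            T[i, j, (-i - j) % q] = 1
    return T

def matmul_tensor(n: int, m: int, p: int) -> np.ndarray:
    """Matrix multiplication tensor <n, m, p>: the sum over i, j, k of
    x_{i,j} * y_{j,k} * z_{k,i}, so the three axes index the entries of
    A (n x m), B (m x p), and C = AB (entries labeled by pairs (k, i))."""
    T = np.zeros((n * m, m * p, p * n), dtype=int)
    for i in range(n):
        for j in range(m):
            for k in range(p):
                T[i * m + j, j * p + k, k * n + i] = 1
    return T

def kron_power(T: np.ndarray, N: int) -> np.ndarray:
    """N-th Kronecker (tensor) power of a 3-way tensor; the techniques
    discussed in the abstract operate on large powers of T_q."""
    P = T
    for _ in range(N - 1):
        s, q = P.shape[0], T.shape[0]
        P = np.einsum('ijk,abc->iajbkc', P, T).reshape(s * q, s * q, s * q)
    return P

def zero_out(T: np.ndarray, Sx, Sy, Sz) -> np.ndarray:
    """Zeroing out: set every x-, y-, z-variable outside Sx, Sy, Sz to
    zero, keeping only the terms supported on the surviving indices."""
    Z = np.zeros_like(T)
    Z[np.ix_(Sx, Sy, Sz)] = T[np.ix_(Sx, Sy, Sz)]
    return Z

if __name__ == "__main__":
    T3 = structural_tensor(3)
    print(T3.sum())                      # q^2 = 9 nonzero terms
    print(matmul_tensor(2, 2, 2).sum())  # n*m*p = 8 nonzero terms
    print(kron_power(T3, 2).shape)       # (9, 9, 9)
    print(zero_out(T3, [0, 1], [0], [0, 1, 2]).sum())  # 2 surviving terms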
Keywords
matrix multiplication