The Complexity of Dynamic Least-Squares Regression

2023 IEEE 64th Annual Symposium on Foundations of Computer Science (FOCS 2023)

Abstract
We settle the complexity of dynamic least-squares regression (LSR), where rows and labels $(A^{(t)}, b^{(t)})$ can be adaptively inserted and/or deleted, and the goal is to efficiently maintain an $\epsilon$-approximate solution to $\min_{x^{(t)}} \|A^{(t)} x^{(t)} - b^{(t)}\|_2$ for all $t \in [T]$. We prove sharp separations ($d^{2-o(1)}$ vs. $\sim d$) between the amortized update time of: (i) fully vs. partially dynamic $0.01$-LSR; (ii) high- vs. low-accuracy LSR in the partially dynamic (insertion-only) setting. Our lower bounds follow from a gap-amplification reduction, reminiscent of iterative refinement, from the exact version of the Online Matrix-Vector Conjecture (OMv) [HKNS15] to constant-approximate OMv over the reals, where the $i$-th online product $Hv^{(i)}$ only needs to be computed to $0.1$-relative error. All previous fine-grained reductions from OMv to its approximate versions only show hardness for inverse-polynomial approximation $\epsilon = n^{-\omega(1)}$ (additive or multiplicative). This result is of independent interest in fine-grained complexity and for the investigation of the OMv Conjecture, which is still widely open.
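To make the problem concrete, here is a minimal sketch (not the paper's algorithm) of the insertion-only setting: one can maintain the normal equations $A^\top A$ and $A^\top b$ under row insertions and re-solve after each update, which is exact but costs $O(d^2)$ per insertion plus $O(d^3)$ per solve. The separations above concern whether such per-update costs can be driven down to $\sim d$ at low accuracy. The class and method names below are our own.

```python
import numpy as np

class DynamicLSR:
    """Naive insertion-only least-squares maintenance via normal equations.

    Illustrative only: maintains A^T A and A^T b with rank-one updates
    and re-solves on demand. This is not the paper's data structure.
    """

    def __init__(self, d):
        self.AtA = np.zeros((d, d))  # running A^T A
        self.Atb = np.zeros(d)       # running A^T b

    def insert(self, a, beta):
        # Insert a new row a with label beta: rank-one update, O(d^2).
        self.AtA += np.outer(a, a)
        self.Atb += beta * a

    def solve(self):
        # Solve the normal equations A^T A x = A^T b, O(d^3).
        return np.linalg.lstsq(self.AtA, self.Atb, rcond=None)[0]

rng = np.random.default_rng(0)
d, n = 3, 10
A = rng.standard_normal((n, d))
b = rng.standard_normal(n)

reg = DynamicLSR(d)
for row, beta in zip(A, b):
    reg.insert(row, beta)

# The maintained solution matches a batch least-squares solve on all rows.
x_dyn = reg.solve()
x_ref, *_ = np.linalg.lstsq(A, b, rcond=None)
print(np.allclose(x_dyn, x_ref))
```

Supporting deletions as well (the fully dynamic setting) is where, per the abstract, a $d^{2-o(1)}$ amortized lower bound kicks in even for constant accuracy.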
Keywords
Numerical linear algebra, dynamic algorithms, fine-grained complexity