Coordinate Methods for Matrix Games

2020 IEEE 61st Annual Symposium on Foundations of Computer Science (FOCS)

Abstract
We develop primal-dual coordinate methods for solving bilinear saddle-point problems of the form min_{x ∈ X} max_{y ∈ Y} y^T Ax, which contain linear programming, classification, and regression as special cases. Our methods push existing fully stochastic sublinear methods and variance-reduced methods towards their limits in terms of per-iteration complexity and sample complexity. We obtain nearly-constant per-iteration complexity by designing efficient data structures that leverage Taylor approximations to the exponential and a binomial heap. We improve sample complexity via low-variance gradient estimators using dynamic sampling distributions that depend on both the iterates and the magnitudes of the matrix entries. Our runtime bounds improve upon those of existing primal-dual methods by a factor depending on sparsity measures of the m-by-n matrix A. For example, when rows and columns have constant ℓ1/ℓ2 norm ratios, we offer improvements by a factor of m + n in the fully stochastic setting and √(m + n) in the variance-reduced setting. We apply our methods to computational geometry problems, namely minimum enclosing ball, maximum inscribed ball, and linear regression, and obtain improved complexity bounds. For linear regression with an elementwise nonnegative matrix, our guarantees improve upon exact gradient methods by a factor of √(nnz(A)/(m + n)).
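To make the flavor of these low-variance estimators concrete, below is a minimal NumPy sketch of an importance-sampled coordinate estimator of the x-gradient A^T y for the objective y^T A x. The particular sampling distribution (rows weighted by |y_i| times the row's ℓ1 norm, entries within a row weighted by |A_ij|) and the helper name sampled_grad_x are illustrative assumptions, not the paper's exact scheme or data structures.

```python
import numpy as np

def sampled_grad_x(A, y, rng):
    # One-sample unbiased estimator of the x-gradient A^T y for the
    # bilinear objective y^T A x.  Illustrative sampling scheme (an
    # assumption, not the paper's exact distribution): pick row i with
    # probability proportional to |y_i| * ||A_i||_1, then entry j within
    # that row proportionally to |A_ij|, and importance-weight the result.
    row_weights = np.abs(y) * np.abs(A).sum(axis=1)
    p = row_weights / row_weights.sum()
    i = rng.choice(A.shape[0], p=p)
    col_weights = np.abs(A[i])
    q = col_weights / col_weights.sum()
    j = rng.choice(A.shape[1], p=q)
    g = np.zeros(A.shape[1])
    g[j] = A[i, j] * y[i] / (p[i] * q[j])  # importance weight so that E[g] = A^T y
    return g

rng = np.random.default_rng(0)
A = np.abs(rng.standard_normal((5, 3)))  # small elementwise-nonnegative example
y = rng.random(5)
est = np.mean([sampled_grad_x(A, y, rng) for _ in range(50000)], axis=0)
print("max abs error vs. exact A^T y:", np.abs(est - A.T @ y).max())
```

Averaging many such single-entry samples recovers the exact gradient, which illustrates how sampling weights tied to both the current iterate y and the matrix entry magnitudes can keep the estimator's variance low while each sample touches only one matrix entry.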
Keywords
minimax optimization,stochastic gradient methods,matrix games,linear regression