A Novel Catalyst Scheme for Stochastic Minimax Optimization

arXiv (2023)

Abstract
This paper presents a proximal-point-based catalyst scheme for simple first-order methods applied to convex minimization and convex-concave minimax problems. In particular, for smooth and (strongly) convex minimization problems, the proposed catalyst scheme, instantiated with a simple variant of the stochastic gradient method, attains the optimal rate of convergence in terms of both deterministic and stochastic errors. For smooth and strongly-convex-strongly-concave minimax problems, the catalyst scheme attains the optimal rate of convergence for deterministic and stochastic errors up to a logarithmic factor. To the best of our knowledge, this rate is attained for the first time by stochastic first-order methods in the literature. We obtain this result by designing and catalyzing a novel variant of the stochastic extragradient method for solving smooth and strongly-monotone variational inequalities, which may be of independent interest.
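The abstract names two algorithmic ingredients. As a rough illustration of the first, a catalyst scheme wraps a simple inner solver around a sequence of proximal subproblems. The following is a minimal sketch under assumed ingredients: `stoch_grad` is a user-supplied unbiased stochastic gradient oracle, and the prox parameter `lam`, step size, and iteration counts are illustrative placeholders. It also omits the extrapolation of the prox centers that a full catalyst scheme performs, so it should be read as the bare proximal-point skeleton, not the paper's method.

```python
import numpy as np

# Hypothetical sketch: an unaccelerated proximal-point ("catalyst-style") outer
# loop around SGD. `stoch_grad`, `lam`, step sizes, and iteration counts are
# illustrative placeholders, not the schedule analyzed in the paper.

def sgd_prox_subproblem(x_center, stoch_grad, lam, n_steps=200, lr=0.01):
    """Approximately solve min_x f(x) + (lam/2)||x - x_center||^2 with SGD."""
    x = x_center.copy()
    for _ in range(n_steps):
        # Stochastic gradient of the prox-regularized objective.
        g = stoch_grad(x) + lam * (x - x_center)
        x -= lr * g
    return x

def catalyst_loop(x0, stoch_grad, lam=1.0, n_outer=30):
    """Outer proximal-point loop: each step re-centers the subproblem."""
    x = x0
    for _ in range(n_outer):
        x = sgd_prox_subproblem(x, stoch_grad, lam)
    return x

# Toy instance: least squares with mini-batch stochastic gradients.
rng = np.random.default_rng(0)
A, b = rng.standard_normal((200, 5)), rng.standard_normal(200)

def stoch_grad(x, batch=8):
    idx = rng.integers(0, len(b), size=batch)
    Ai, bi = A[idx], b[idx]
    return 2.0 * Ai.T @ (Ai @ x - bi) / batch

x_hat = catalyst_loop(np.zeros(5), stoch_grad)
```

For the second ingredient, the textbook stochastic extragradient step for a monotone operator F takes a look-ahead half-step and then updates using the operator evaluated at the half-point. The paper's novel variant refines this template; the sketch below shows only the classical version, on a made-up strongly-monotone affine operator arising from a quadratic saddle problem.

```python
import numpy as np

# Textbook stochastic extragradient for a strongly-monotone operator F.
# The affine operator below is an illustrative example, not from the paper.

def stoch_extragradient(z0, stoch_op, eta=0.05, n_iter=1000):
    z = z0
    for _ in range(n_iter):
        z_half = z - eta * stoch_op(z)   # look-ahead (extrapolation) half-step
        z = z - eta * stoch_op(z_half)   # update with the operator at z_half
    return z

# Operator of min_x max_y (mu/2)||x||^2 + x^T B y - (mu/2)||y||^2:
# F(x, y) = (grad_x, -grad_y), which is mu-strongly monotone.
rng = np.random.default_rng(1)
mu, d = 1.0, 5
B = rng.standard_normal((d, d))

def stoch_op(z, noise=0.1):
    x, y = z[:d], z[d:]
    Fx = mu * x + B @ y
    Fy = mu * y - B.T @ x
    return np.concatenate([Fx, Fy]) + noise * rng.standard_normal(2 * d)

z_hat = stoch_extragradient(np.zeros(2 * d), stoch_op)  # approaches z* = 0
```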