New Primal-Dual Algorithms for a Class of Nonsmooth and Nonlinear Convex-Concave Minimax Problems.

SIAM Journal on Optimization (2022)

Abstract
In this paper, we develop new primal-dual algorithms to solve a class of nonsmooth and nonlinear convex-concave minimax problems, which covers many existing and new models as special cases. Our approach relies on a combination of a generalized augmented Lagrangian function, Nesterov's accelerated scheme, and adaptive parameter updating strategies. Our algorithmic framework is single-loop and unifies two important settings: the general convex-concave and the convex-linear cases. Under mild assumptions, our algorithms achieve O(1/k) convergence rates measured by three different criteria: the primal-dual gap, the primal objective residual, and the dual objective residual, where k is the iteration counter. Our rates are both ergodic (i.e., on a weighted averaging sequence) and nonergodic (i.e., on the last-iterate sequence). These convergence rates can be boosted to O(1/k^2) if one objective term is strongly convex (or, equivalently, its conjugate is L-smooth). To the best of our knowledge, this is the first algorithm achieving optimal rates on the primal last-iterate sequence for convex-linear minimax problems. As a byproduct, we specialize our algorithms to solve a general convex cone constrained program with both ergodic and nonergodic rate guarantees. We test our algorithms and compare them with two recent methods on two numerical examples.
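To make the problem class concrete, the following is a minimal LaTeX sketch of a standard saddle-point template consistent with the abstract's description; the functions f, g, H and the operator K are illustrative placeholders, and the paper's exact formulation may differ.

% Illustrative template (an assumption, not the paper's verbatim setting):
\[
  \min_{x \in \mathbb{R}^p} \; \max_{y \in \mathbb{R}^n}
  \Big\{ \mathcal{L}(x, y) := f(x) + \mathcal{H}(x, y) - g(y) \Big\},
\]
% where $f$ and $g$ are proper, closed, convex (possibly nonsmooth) functions,
% $\mathcal{H}(\cdot, y)$ is convex for each fixed $y$, and
% $\mathcal{H}(x, \cdot)$ is concave for each fixed $x$.
% The convex-linear special case mentioned in the abstract corresponds to
% $\mathcal{H}(x, y) = \langle K x, y \rangle$ for a linear operator $K$.
% One of the three convergence criteria, the primal-dual gap at $(\bar{x}, \bar{y})$, is
\[
  \mathcal{G}(\bar{x}, \bar{y}) := \max_{y} \mathcal{L}(\bar{x}, y)
  - \min_{x} \mathcal{L}(x, \bar{y}),
\]
% which the proposed algorithms drive to zero at the stated $O(1/k)$
% (or $O(1/k^2)$ under strong convexity of one term) rates.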
Keywords
convex-concave minimax problem, primal-dual algorithm, optimal convergence rate, last-iterate convergence rate, Nesterov's accelerated scheme, convex cone constrained program