An accelerated minimax algorithm for convex-concave saddle point problems with nonsmooth coupling function
arXiv (2022)
Abstract
In this work we aim to solve a convex-concave saddle point problem, where the convex-concave coupling function is smooth in one variable and nonsmooth in the other, and is not assumed to be linear in either. The problem is augmented by a nonsmooth regulariser in the smooth component. We propose and investigate a novel algorithm named OGAProx, consisting of an optimistic gradient ascent step in the smooth variable coupled with a proximal step of the regulariser, alternated with a proximal step in the nonsmooth component of the coupling function. We consider the convex-concave, convex-strongly concave and strongly convex-strongly concave settings of the saddle point problem under investigation. For the iterates we obtain (weak) convergence, a convergence rate of order 𝒪(1/K), and linear convergence of order 𝒪(θ^K) with θ < 1, respectively. In terms of function values we obtain ergodic convergence rates of order 𝒪(1/K), 𝒪(1/K²) and 𝒪(θ^K) with θ < 1, respectively. We validate our theoretical considerations on a nonsmooth-linear saddle point problem, the training of multi-kernel support vector machines, and a classification problem incorporating minimax group fairness.
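The iteration described in the abstract can be sketched on a toy instance. The following is a minimal illustrative sketch based only on the abstract's description (an optimistic gradient ascent step in the smooth variable followed by a prox/projection through the regulariser, alternated with a proximal step in the nonsmooth variable); it is not the authors' exact scheme, and the toy problem, step sizes and parameter names are assumptions chosen for illustration.

```python
# Hypothetical toy instance (not from the paper):
#     min_x max_{y >= 0}  Phi(x, y) = y*|x| - y**2 / 2
# Phi is convex and nonsmooth in x (for y >= 0), smooth and strongly
# concave in y; the saddle point is (x*, y*) = (0, 0).

def soft_threshold(v, lam):
    """Prox of lam*|.|, i.e. argmin_x lam*|x| + (x - v)**2 / 2."""
    if v > lam:
        return v - lam
    if v < -lam:
        return v + lam
    return 0.0

def ogaprox_toy(x0=1.0, y0=0.0, tau=0.5, sigma=0.5, theta=1.0, iters=2000):
    grad_y = lambda x, y: abs(x) - y      # partial derivative of Phi in y
    x, y = x0, y0
    g_prev = grad_y(x, y)                 # gradient at the previous pair
    for _ in range(iters):
        g = grad_y(x, y)
        # Optimistic (extrapolated) ascent step in y, followed by the prox
        # of the regulariser -- here the indicator of [0, inf), whose prox
        # is the projection onto y >= 0.
        y = max(0.0, y + sigma * ((1 + theta) * g - theta * g_prev))
        g_prev = g
        # Proximal step in the nonsmooth variable x:
        #     argmin_x  y*|x| + (x - x_k)**2 / (2*tau)
        x = soft_threshold(x, tau * y)
    return x, y

x_K, y_K = ogaprox_toy()
print(x_K, y_K)   # both iterates approach the saddle point (0, 0)
```

On this instance the x-update has a closed form (soft-thresholding), which is exactly the kind of structure the proximal step in the nonsmooth component exploits; step sizes here are ad hoc and do not reflect the conditions derived in the paper.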
Keywords
Saddle point problem, Convex-concave, Minimax algorithm, Convergence rate, Acceleration, Linear convergence