A Stochastic GDA Method With Backtracking For Solving Nonconvex (Strongly) Concave Minimax Problems

arXiv (2024)

Abstract
We propose a stochastic GDA (gradient descent ascent) method with backtracking (SGDA-B) to solve nonconvex-(strongly) concave (NCC) minimax problems of the form $\min_x \max_y \sum_{i=1}^{N} g_i(x_i) + f(x,y) - h(y)$, where $h$ and $g_i$ for $i = 1, \ldots, N$ are closed convex functions, and $f$ is $L$-smooth and $\mu$-strongly concave in $y$ for some $\mu \ge 0$. We consider two scenarios: (i) the deterministic setting, where $\nabla f$ can be computed exactly, and (ii) the stochastic setting, where $\nabla f$ is accessible only through an unbiased stochastic oracle with finite variance. While most existing methods assume knowledge of the Lipschitz constant $L$, SGDA-B is agnostic to $L$; moreover, it supports random block-coordinate updates. In the deterministic setting, SGDA-B computes an $\epsilon$-stationary point within $\mathcal{O}(L\kappa^2/\epsilon^2)$ and $\mathcal{O}(L^3/\epsilon^4)$ gradient calls when $\mu > 0$ and $\mu = 0$, respectively, where $\kappa = L/\mu$. In the stochastic setting, for any $p \in (0,1)$ and $\epsilon > 0$, SGDA-B computes an $\epsilon$-stationary point with probability at least $1-p$ using $\mathcal{O}(L\kappa^3\epsilon^{-4}\log(1/p))$ and $\tilde{\mathcal{O}}(L^4\epsilon^{-7}\log(1/p))$ stochastic oracle calls when $\mu > 0$ and $\mu = 0$, respectively. To our knowledge, SGDA-B is the first GDA-type method with backtracking for NCC minimax problems, and it achieves the best complexity among methods that are agnostic to $L$. We also provide numerical results for SGDA-B on a distributionally robust learning problem, illustrating the potential performance gains SGDA-B can achieve.
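To make the backtracking idea concrete, below is a minimal, hypothetical sketch of deterministic GDA with a doubling backtracking test on a toy strongly-concave-in-$y$ instance. It is not SGDA-B itself: the toy problem, the acceptance test, the two-time-scale step-size rule, and all names (gda_backtracking, L_est, growth) are illustrative assumptions, not the paper's algorithm or experiments. The point it demonstrates is that a local smoothness estimate, grown whenever the observed gradient change violates it, can replace a priori knowledge of $L$.

```python
import numpy as np

# Toy instance: f(x, y) = y^T A x - (mu/2)||y||^2 is L-smooth and
# mu-strongly concave in y (g_i and h are taken as zero for simplicity).
# This is an illustrative stand-in, not the paper's test problem.
rng = np.random.default_rng(0)
A = rng.standard_normal((5, 5))
mu = 1.0

def grad_x(x, y):
    return A.T @ y                      # gradient of f in x

def grad_y(x, y):
    return A @ x - mu * y               # gradient of f in y

def gda_backtracking(x, y, L_est=1.0, iters=500, growth=2.0):
    """Generic two-time-scale GDA whose step sizes are set from a local
    smoothness estimate L_est; a doubling test replaces knowledge of the
    true Lipschitz constant L. (Hypothetical sketch, not SGDA-B.)"""
    for _ in range(iters):
        while True:
            kappa = max(L_est / mu, 1.0)
            tau = 1.0 / (L_est * kappa**2)      # primal (descent) step
            sigma = 1.0 / L_est                 # dual (ascent) step
            gx, gy = grad_x(x, y), grad_y(x, y)
            x_new, y_new = x - tau * gx, y + sigma * gy
            # Accept the step if the observed gradient change is
            # consistent with L_est-smoothness; otherwise grow L_est
            # and retry with smaller steps.
            num = np.linalg.norm(np.concatenate(
                [grad_x(x_new, y_new) - gx, grad_y(x_new, y_new) - gy]))
            den = np.linalg.norm(np.concatenate([x_new - x, y_new - y]))
            if den == 0.0 or num <= L_est * den:
                break
            L_est *= growth
        x, y = x_new, y_new
    return x, y, L_est

x0, y0 = rng.standard_normal(5), rng.standard_normal(5)
x_star, y_star, L_hat = gda_backtracking(x0, y0)
print("final primal gradient norm:", np.linalg.norm(grad_x(x_star, y_star)))
print("final smoothness estimate:", L_hat)
```

The two-time-scale ratio $\tau \approx \sigma/\kappa^2$ used here mirrors standard GDA analyses for nonconvex-strongly-concave problems; the paper's actual backtracking condition, step-size rules, stochastic oracle handling, and block-coordinate updates differ in the details.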