Dual Space Preconditioning For Gradient Descent
SIAM Journal on Optimization (2021)
Abstract
The conditions of relative smoothness and relative strong convexity were recently introduced for the analysis of Bregman gradient methods for convex optimization. We introduce a generalized left-preconditioning method for gradient descent and show that its convergence on an essentially smooth convex objective function can be guaranteed via an application of relative smoothness in the dual space. Our relative smoothness assumption is between the designed preconditioner and the convex conjugate of the objective, and it generalizes the typical Lipschitz gradient assumption. Under dual relative strong convexity, we obtain linear convergence with a generalized condition number that is invariant under horizontal translations, distinguishing it from Bregman gradient methods. Thus, in principle our method is capable of improving the conditioning of gradient descent on problems with a non-Lipschitz gradient or nonstrongly convex structure. We demonstrate our method on p-norm regression and exponential penalty function minimization.
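The abstract does not spell out the update rule, but the left-preconditioning it describes can be illustrated with a step of the form x⁺ = x − λ∇h(∇f(x)), where f is the objective and h is the designed preconditioner acting on the gradient in the dual space; this is an assumed reading, not code from the paper. The sketch below (all names such as grad_f, grad_h, and lam are hypothetical) uses the one-dimensional p-norm objective f(x) = |x|^p / p, whose gradient is not Lipschitz for p > 2, and pairs it with the conjugate power h(g) = |g|^q / q, 1/p + 1/q = 1. Under that pairing the preconditioned step contracts linearly, while plain gradient descent slows to a crawl near the minimizer.

```python
import numpy as np

# Minimal sketch (assumed form of the method, not the authors' code):
# dual-space preconditioned gradient descent
#     x+ = x - lam * grad_h(grad_f(x))
# on f(x) = |x|^p / p, whose gradient vanishes rapidly near 0 for p > 2.
p = 4.0
q = p / (p - 1.0)  # conjugate exponent: 1/p + 1/q = 1

def grad_f(x):
    # gradient of the objective |x|^p / p
    return np.sign(x) * np.abs(x) ** (p - 1)

def grad_h(g):
    # gradient of the dual preconditioner |g|^q / q
    return np.sign(g) * np.abs(g) ** (q - 1)

x_gd, x_pre, lam = 2.0, 2.0, 0.1
for _ in range(50):
    x_gd = x_gd - lam * grad_f(x_gd)             # plain gradient descent
    x_pre = x_pre - lam * grad_h(grad_f(x_pre))  # dual preconditioned step

print(x_gd, x_pre)
```

With this pairing, grad_h(grad_f(x)) = sign(x)·|x|^((p−1)(q−1)) = x, so the preconditioned iterate satisfies x⁺ = (1 − λ)x and converges linearly regardless of p, while the plain gradient step x⁺ = x(1 − λ|x|^(p−2)) loses its contraction as x approaches 0. The quadratic choice h = ½|·|² recovers ordinary gradient descent.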
Keywords
convex optimization, relative smoothness, first-order method, p-norm regression, exponential penalty function, nonlinear preconditioning