Inexact Variable Metric Stochastic Block-Coordinate Descent for Regularized Optimization

Journal of Optimization Theory and Applications (2020)

Abstract
Block-coordinate descent is a popular framework for large-scale regularized optimization problems with block-separable structure. Existing methods have several limitations. They often assume that subproblems can be solved exactly at each iteration, which in practical terms usually restricts the quadratic term in the subproblem to be diagonal, thus losing most of the benefits of higher-order derivative information. Moreover, in contrast to the smooth case, non-uniform sampling of the blocks has not yet been shown to improve the convergence rate bounds for regularized problems. This work proposes an inexact randomized block-coordinate descent method based on a regularized quadratic subproblem, in which the quadratic term can vary from iteration to iteration: a “variable metric.” We provide a detailed convergence analysis for both convex and non-convex problems. Our analysis generalizes, to the regularized case, Nesterov’s proposal for improving convergence of block-coordinate descent by sampling proportional to the blockwise Lipschitz constants. We improve the convergence rate in the convex case by weakening the dependency on the initial objective value. Empirical results also show that significant benefits accrue from the use of a variable metric.
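To make the described method concrete, here is a minimal sketch of an inexact variable-metric randomized block-coordinate descent step, written in Python with NumPy. This is an illustration under stated assumptions, not the paper's exact algorithm: the names grad_f, prox_blocks, and H_fn, the inner proximal-gradient loop used as the inexact subproblem solver, and the spectral-norm step size are all assumptions made for this sketch.

```python
import numpy as np

def inexact_vm_bcd(grad_f, prox_blocks, x0, blocks, L, H_fn,
                   n_iters=1000, inner_iters=5, seed=0):
    """Sketch: inexact variable-metric randomized block-coordinate descent.

    grad_f      : gradient of the smooth term f (hypothetical callable)
    prox_blocks : prox_blocks[i](v, t) = prox of t * Omega_i at v (hypothetical)
    blocks      : list of index arrays partitioning the coordinates
    L           : blockwise Lipschitz constants, used for non-uniform sampling
    H_fn        : H_fn(x, i) -> SPD matrix for block i (the variable metric)
    """
    rng = np.random.default_rng(seed)
    # Sample blocks proportionally to their Lipschitz constants,
    # as in Nesterov-style non-uniform sampling.
    p = np.asarray(L, dtype=float) / np.sum(L)
    x = x0.copy()
    for _ in range(n_iters):
        i = rng.choice(len(blocks), p=p)
        idx = blocks[i]
        g = grad_f(x)[idx]               # blockwise gradient
        H = H_fn(x, i)                   # variable metric for this block
        eta = 1.0 / np.linalg.norm(H, 2) # safe step for the inner solver
        # Solve the regularized quadratic subproblem
        #   min_d  g^T d + 0.5 d^T H d + Omega_i(x_i + d)
        # only inexactly, via a few proximal-gradient steps.
        d = np.zeros(len(idx))
        for _ in range(inner_iters):
            d = prox_blocks[i](x[idx] + d - eta * (g + H @ d), eta) - x[idx]
        x[idx] += d
    return x
```

Choosing H_fn(x, i) = L[i] * np.eye(len(blocks[i])) reduces this sketch to a standard proximal coordinate-descent step with an exactly solvable diagonal subproblem; supplying a blockwise Hessian approximation of f instead is what gives the variable-metric behavior the abstract describes.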
Keywords
Variable metric, Stochastic coordinate descent, Regularized optimization, Inexact method, Arbitrary sampling