A Stochastic Variance Reduced Primal Dual Fixed Point Method for Linearly Constrained Separable Optimization

SIAM Journal on Imaging Sciences (2021)

Abstract
In this paper we combine the stochastic variance reduced gradient (SVRG) method [R. Johnson and T. Zhang, NeurIPS, 2013] with the primal dual fixed point (PDFP) method [Inverse Problems, 29 (2013)] to minimize a sum of two convex functions, one of which is composed with a linear operator. This type of problem typically arises in sparse signal and image reconstruction. The proposed SVRG-PDFP can be seen as a generalization of Prox-SVRG [L. Xiao and T. Zhang, SIAM J. Optim., 24 (2014), pp. 2057-2075], originally designed for the minimization of a sum of two convex functions. Under standard assumptions, we propose two variants, one for strongly convex objective functions and the other for the general convex case. Convergence analysis shows that the convergence rate of SVRG-PDFP is O(1/k) (here k is the iteration number) for general convex objective functions and linear for the strongly convex case. Numerical examples on machine learning and computerized tomography image reconstruction are provided to show the effectiveness of the algorithms.
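The variance-reduction idea the abstract builds on can be sketched with a textbook SVRG loop; the code below is a generic illustration on a ridge-regression instance, not the paper's SVRG-PDFP algorithm, and all names, step sizes, and epoch counts are illustrative assumptions.

```python
import numpy as np

# Minimal SVRG sketch (Johnson & Zhang, 2013) for min_x (1/n) sum_i f_i(x),
# shown on ridge regression f_i(x) = 0.5*(a_i^T x - b_i)^2 + 0.5*lam*||x||^2.
# Hyperparameters below are illustrative, not tuned values from the paper.
def svrg_ridge(A, b, lam=0.1, step=0.05, epochs=30, seed=0):
    rng = np.random.default_rng(seed)
    n, d = A.shape
    x_snap = np.zeros(d)
    grad_i = lambda x, i: A[i] * (A[i] @ x - b[i]) + lam * x
    for _ in range(epochs):
        # Full gradient at the snapshot: the variance-reduction anchor.
        mu = A.T @ (A @ x_snap - b) / n + lam * x_snap
        x = x_snap.copy()
        for _ in range(2 * n):  # inner loop of cheap stochastic steps
            i = rng.integers(n)
            # Variance-reduced gradient estimate: unbiased, and its
            # variance vanishes as x and x_snap approach the optimum.
            g = grad_i(x, i) - grad_i(x_snap, i) + mu
            x -= step * g
        x_snap = x  # refresh the snapshot for the next epoch
    return x_snap

# Usage: recover the closed-form ridge solution on a small random problem.
rng = np.random.default_rng(1)
A = rng.standard_normal((200, 5))
b = A @ np.ones(5) + 0.01 * rng.standard_normal(200)
x = svrg_ridge(A, b)
x_star = np.linalg.solve(A.T @ A / 200 + 0.1 * np.eye(5), A.T @ b / 200)
err = np.linalg.norm(x - x_star)
```

The paper's contribution is to carry this variance-reduced gradient estimate into the PDFP framework, so the composite (linearly constrained) term is handled by primal-dual updates rather than a simple proximal step.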
Keywords
stochastic variance reduced gradient, primal dual fixed point method, image restoration