Adaptive Stochastic Primal-Dual Coordinate Descent for Separable Saddle Point Problems

ECML/PKDD (2015)

Abstract
We consider a generic convex-concave saddle point problem with a separable structure, a form that covers a wide range of machine learning applications. Under this problem structure, we follow the primal-dual update framework for saddle point problems and incorporate stochastic block coordinate descent with adaptive stepsizes into it. We show theoretically that the proposed adaptive stepsizes can achieve a sharper linear convergence rate than existing methods. Additionally, since we can select "mini-batches" of block coordinates to update, our method is also amenable to parallel processing for large-scale data. We apply the proposed method to regularized empirical risk minimization and show that it performs comparably to, and more often better than, state-of-the-art methods on both synthetic and real-world data sets.
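
For concreteness, below is a minimal sketch (Python with NumPy) of a stochastic primal-dual coordinate update of the kind described above, specialized to ridge regression written in its saddle-point form

    min_x max_y  (1/n) * sum_i ( y_i * <a_i, x> - phi_i*(y_i) ) + (lam/2) * ||x||^2

where phi_i* is the convex conjugate of the per-example loss (for the squared loss, phi_i*(y) = y^2/2 + b_i * y). The function name spdc_ridge, the fixed stepsizes sigma and tau, the extrapolation weight theta, and the single-coordinate sampling are illustrative assumptions, not the paper's algorithm: its contribution is precisely to adapt the stepsizes and to update mini-batches of coordinate blocks.

import numpy as np

def spdc_ridge(A, b, lam=0.1, sigma=0.5, tau=0.5, theta=0.9, epochs=30, seed=0):
    """Sketch of single-coordinate stochastic primal-dual updates (SPDC-style)
    for min_x (1/2n) * sum_i (<a_i, x> - b_i)^2 + (lam/2) * ||x||^2."""
    rng = np.random.default_rng(seed)
    n, d = A.shape
    x = np.zeros(d)
    x_bar = x.copy()              # extrapolated primal point used in the dual step
    y = np.zeros(n)
    u = np.zeros(d)               # u = (1/n) * A.T @ y, maintained incrementally
    for _ in range(epochs * n):
        i = rng.integers(n)       # sample one dual coordinate (a "block" of size 1)
        # Dual prox step; closed form for the squared-loss conjugate y^2/2 + b_i*y:
        #   argmax_y  y*<a_i, x_bar> - y^2/2 - b_i*y - (y - y[i])^2 / (2*sigma)
        y_new = (y[i] + sigma * (A[i] @ x_bar - b[i])) / (1.0 + sigma)
        dy = y_new - y[i]
        y[i] = y_new
        # Primal prox step; closed form because the regularizer is (lam/2)*||x||^2.
        # Full weight on dy * A[i] makes u + dy * A[i] an unbiased estimate,
        # over the sampled i, of the updated average (1/n) * A.T @ y.
        x_new = (x - tau * (u + dy * A[i])) / (1.0 + tau * lam)
        u += dy * A[i] / n        # keep the running average consistent with y
        x_bar = x_new + theta * (x_new - x)   # extrapolation for the next dual step
        x = x_new
    return x

Vanilla SPDC fixes sigma and tau from global problem constants such as max_i ||a_i||; per the abstract, choosing the stepsizes adaptively for the sampled coordinate blocks is what yields the sharper linear rate.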
Keywords
Large-scale optimization, Parallel optimization, Stochastic coordinate descent, Convex-concave saddle point problems