Stochastic Parallel Block Coordinate Descent for Large-Scale Saddle Point Problems.

THIRTIETH AAAI CONFERENCE ON ARTIFICIAL INTELLIGENCE (2016)

Abstract
We consider convex-concave saddle point problems with a separable structure and non-strongly convex functions. We propose an efficient stochastic block coordinate descent method with adaptive primal-dual updates, which enables flexible parallel optimization for large-scale problems. Our method combines the efficiency and flexibility of block coordinate descent with the simplicity of primal-dual methods, while exploiting the separable structure of the convex-concave saddle point problem. It applies to a wide range of machine learning problems, including robust principal component analysis, Lasso, and feature selection by group Lasso. Both theoretically and empirically, we demonstrate significantly better performance than state-of-the-art methods in all these applications.
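To make the setting concrete, the following is a minimal sketch (not the authors' exact algorithm) of a stochastic primal-dual coordinate scheme on a separable saddle point problem: ridge regression written as min_x max_y y^T(Ax − b) − ½||y||² + (λ/2)||x||². Each iteration samples one dual coordinate (block), applies a proximal update to it, and then takes a proximal step on the primal variable. The function name, step sizes, and problem instance are illustrative assumptions.

```python
import numpy as np

def spdc_ridge(A, b, lam=0.1, sigma=0.5, tau=0.01, iters=5000, seed=0):
    """Illustrative stochastic primal-dual block coordinate descent for
    min_x max_y y^T(Ax - b) - 0.5*||y||^2 + 0.5*lam*||x||^2,
    whose saddle point is the ridge-regression solution.
    One randomly chosen dual coordinate is updated per iteration;
    step sizes sigma, tau are conservative illustrative choices."""
    rng = np.random.default_rng(seed)
    n, d = A.shape
    x = np.zeros(d)
    y = np.zeros(n)
    u = np.zeros(d)  # running value of A^T y, refreshed in O(d) per step
    for _ in range(iters):
        i = rng.integers(n)  # sample one dual block uniformly at random
        # proximal (implicit) update of the sampled dual coordinate y_i
        y_new = (y[i] + sigma * (A[i] @ x - b[i])) / (1.0 + sigma)
        u += (y_new - y[i]) * A[i]  # keep u = A^T y consistent cheaply
        y[i] = y_new
        # proximal update of the primal variable against the L2 penalty
        x = (x - tau * u) / (1.0 + tau * lam)
    return x
```

Because only one dual block changes per iteration, the per-step cost is O(d), and disjoint blocks could in principle be updated in parallel, which is the flexibility the abstract refers to.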