Distributed Block Coordinate Descent for Minimizing Partially Separable Functions

Springer Proceedings in Mathematics & Statistics (2015)

Abstract
A distributed randomized block coordinate descent method for minimizing a convex function of a huge number of variables is proposed. The complexity of the method is analyzed under the assumption that the smooth part of the objective function is partially block separable. The number of iterations required is bounded by a function of the error and the degree of separability, which extends the results of Richtarik and Takac (Parallel Coordinate Descent Methods for Big Data Optimization, Mathematical Programming, DOI: 10.1007/s10107-015-0901-6) to a distributed environment. Several approaches to distributing and synchronizing the computation across a cluster of multi-core computers are described, and promising computational results are provided.
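To make the technique concrete, below is a minimal sketch of randomized block coordinate descent on a partially separable quadratic, using the step-size inflation factor beta = 1 + (omega - 1)(tau - 1)/(n - 1) for tau-nice sampling from the cited Richtarik-Takac analysis. The quadratic objective, problem dimensions, sampling size tau, and all variable names are illustrative assumptions; this is a serial simulation of one parallel update per iteration, not the paper's distributed implementation.

```python
import numpy as np

# Illustrative sketch: randomized block coordinate descent on a partially
# separable quadratic f(x) = 0.5 * ||A x - b||^2, where each of the m terms
# (rows of A) depends on exactly omega coordinates. All data are synthetic.

rng = np.random.default_rng(0)

n, m, omega = 100, 300, 5        # variables, terms, nonzeros per term (assumed)
A = np.zeros((m, n))
for i in range(m):               # each term touches exactly omega coordinates
    cols = rng.choice(n, size=omega, replace=False)
    A[i, cols] = rng.standard_normal(omega)
b = rng.standard_normal(m)

def f(x):
    r = A @ x - b
    return 0.5 * r @ r

L = (A ** 2).sum(axis=0)         # coordinate Lipschitz constants L_j = ||A[:, j]||^2
L = np.maximum(L, 1e-12)         # guard against columns no term touches
tau = 10                         # coordinates updated "in parallel" per iteration
beta = 1.0 + (omega - 1) * (tau - 1) / (n - 1)   # ESO factor for tau-nice sampling

x = np.zeros(n)
for k in range(2000):
    S = rng.choice(n, size=tau, replace=False)   # random tau-nice sampling
    g = A.T @ (A @ x - b)        # full gradient for clarity; a real solver
                                 # computes only g[S] via the sparsity of A
    x[S] -= g[S] / (beta * L[S]) # safe simultaneous per-coordinate steps

print(f"f(x) after 2000 iterations: {f(x):.6f}")
```

The factor beta inflates the per-coordinate curvature estimates L_j to keep tau simultaneous updates safe: the more coordinates each term couples (larger omega) and the more coordinates updated at once (larger tau), the more conservative each step must be, which is exactly how the iteration bound comes to depend on the degree of separability.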
Keywords
Distributed coordinate descent, Empirical risk minimization, Support vector machine, Big data optimization, Partial separability, Huge-scale optimization, Iteration complexity, Expected separable over-approximation, Composite objective, Convex optimization, Communication complexity