Networked optimization with adaptive communication

IEEE Global Conference on Signal and Information Processing (GlobalSIP 2013)

Abstract
Methods for distributed optimization are necessary for solving the large-scale problems increasingly common in machine learning, but the cost of transmitting large messages between nodes can become a serious performance bottleneck. We propose a consensus-based distributed algorithm for minimizing a convex separable objective. Each node holds one component of the objective function, and the nodes alternate between a computation phase, in which local gradient steps are taken based on the local objective, and a communication phase, in which consensus steps bring the local states into agreement. Each node uses a local decision rule to adaptively determine when communication is unnecessary. This results in significantly lower communication costs and allows a user to trade off the amount of communication against the accuracy of the final output. Experiments on a cluster with simulated and real datasets illustrate this trade-off.
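The abstract describes an alternation between a gradient computation phase and an adaptively triggered communication phase. Below is a minimal single-process sketch of that pattern, assuming quadratic local objectives, a ring network with Metropolis mixing weights, and a drift-based threshold as the local decision rule; the step size, threshold, and triggering rule are illustrative assumptions, not the paper's exact design.

```python
import numpy as np

rng = np.random.default_rng(0)
n_nodes, dim = 5, 3

# Hypothetical local objectives: f_i(x) = 0.5 * ||A_i x - b_i||^2,
# so the global objective is the separable sum f(x) = sum_i f_i(x).
A = [rng.standard_normal((10, dim)) for _ in range(n_nodes)]
b = [rng.standard_normal(10) for _ in range(n_nodes)]

def local_grad(i, x):
    """Gradient of node i's local objective at x."""
    return A[i].T @ (A[i] @ x - b[i])

# Doubly stochastic Metropolis weights for a ring topology.
W = np.zeros((n_nodes, n_nodes))
for i in range(n_nodes):
    for j in ((i - 1) % n_nodes, (i + 1) % n_nodes):
        W[i, j] = 1.0 / 3.0
    W[i, i] = 1.0 - W[i].sum()

x = np.zeros((n_nodes, dim))          # local states, one row per node
last_sent = np.zeros((n_nodes, dim))  # last state each node broadcast
step, threshold = 0.01, 1e-3          # illustrative tuning parameters
broadcasts = 0

for t in range(500):
    # Computation phase: each node takes a gradient step on its own f_i.
    for i in range(n_nodes):
        x[i] -= step * local_grad(i, x[i])

    # Adaptive communication: a node broadcasts only when its state has
    # drifted far enough from the value its neighbors last received.
    # This drift test is one plausible local decision rule, not
    # necessarily the rule used in the paper.
    for i in range(n_nodes):
        if np.linalg.norm(x[i] - last_sent[i]) > threshold:
            last_sent[i] = x[i].copy()
            broadcasts += 1

    # Consensus phase: mix own state with neighbors' last-received states.
    x_new = np.empty_like(x)
    for i in range(n_nodes):
        x_new[i] = W[i, i] * x[i]
        for j in range(n_nodes):
            if j != i and W[i, j] > 0:
                x_new[i] += W[i, j] * last_sent[j]
    x = x_new

x_bar = x.mean(axis=0)
grad_norm = np.linalg.norm(sum(local_grad(i, x_bar) for i in range(n_nodes)))
print(f"broadcasts: {broadcasts}, "
      f"disagreement: {np.linalg.norm(x - x_bar):.4f}, "
      f"|grad f(x_bar)|: {grad_norm:.4f}")
```

Raising the threshold trades fewer broadcasts for looser final agreement among the local states, mirroring the communication/accuracy trade-off the experiments illustrate.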
Keywords
distributed algorithms, gradient methods, minimisation, adaptive communication, communication cost, communication phase, computation phase, consensus-based distributed algorithm, convex separable objective minimization, distributed optimization, large message transmission, large-scale problems, local decision rules, local gradient, local objective, local states, networked optimization, objective function