Preconditioning ADMM for Fast Decentralized Optimization

2020 IEEE International Conference on Acoustics, Speech, and Signal Processing (ICASSP 2020)

Abstract
In this work, we consider the distributed optimization problem over networked computing machines. Specifically, we are interested in solving this problem with the alternating direction method of multipliers (ADMM) while accounting for edge weights. Existing works focus on star graphs and use simple heuristics for other types of graphs. The present work shows that optimal edge-weight design is equivalent to choosing the preconditioning matrix of ADMM that yields the fastest convergence. Based on a tight convergence rate of ADMM, we show that the preconditioning matrix for general graphs can be found by minimizing the ratio of the largest to the smallest nonzero eigenvalue of the graph Laplacian. Numerical experiments show that preconditioned ADMM reaches a given accuracy much faster, using fewer communication rounds, and demonstrate robustness to topology changes of the underlying network.
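To make the design criterion concrete, the sketch below computes the quantity the abstract says should be minimized: the ratio of the largest to the smallest nonzero eigenvalue of the weighted graph Laplacian. This is not the authors' code; the graph, edge weights, and function name are illustrative assumptions only.

```python
import numpy as np

def laplacian_eigenvalue_ratio(edges, weights, n_nodes):
    """Ratio of largest to smallest nonzero eigenvalue of the weighted graph Laplacian.

    Per the abstract, edge weights would be chosen to make this ratio as small
    as possible, which corresponds to the preconditioning matrix giving the
    fastest ADMM convergence. (Illustrative sketch, not the paper's method code.)
    """
    L = np.zeros((n_nodes, n_nodes))
    for (i, j), w in zip(edges, weights):
        # Weighted Laplacian: degree terms on the diagonal, -w off-diagonal.
        L[i, i] += w
        L[j, j] += w
        L[i, j] -= w
        L[j, i] -= w
    eigvals = np.linalg.eigvalsh(L)        # Laplacian is symmetric PSD
    nonzero = eigvals[eigvals > 1e-10]     # drop the (near-)zero eigenvalue(s)
    return nonzero.max() / nonzero.min()

# Example (assumed graph): a 4-node ring with two candidate edge weightings.
edges = [(0, 1), (1, 2), (2, 3), (3, 0)]
print(laplacian_eigenvalue_ratio(edges, [1.0, 1.0, 1.0, 1.0], 4))  # uniform weights -> ratio 2.0
print(laplacian_eigenvalue_ratio(edges, [1.0, 0.5, 1.0, 0.5], 4))  # non-uniform weights -> larger ratio
```

In this toy comparison, the uniform weighting gives the smaller eigenvalue ratio, so under the abstract's criterion it would be the preferred edge-weight design for this graph.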
Keywords
Decentralized optimization, ADMM, preconditioning, hybrid ADMM