Distributed sparse regression by consensus-based primal-dual perturbation optimization

GlobalSIP (2013)

Abstract
This paper studies the decentralized solution of a multi-agent sparse regression problem posed as a globally coupled objective function with a non-smooth, sparsity-promoting constraint. In particular, we propose a distributed primal-dual perturbation (PDP) method that combines the average consensus technique with the primal-dual perturbed subgradient method. Compared to the conventional primal-dual (PD) subgradient method without perturbation, the PDP subgradient method exhibits faster convergence. To handle the non-smooth constraint, we propose a novel proximal-gradient-type perturbation point. The proposed optimization algorithm can be implemented as a fully decentralized protocol, with each agent using only its local information and exchanging messages only with its neighbors. We show that the proposed method converges to the global optimum of the considered problem under standard convexity and network assumptions.
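The abstract names two building blocks: an average consensus step over a network of agents, and a proximal step for the non-smooth sparsity term. The sketch below illustrates those two ingredients with a simple distributed proximal-gradient (consensus) loop for sparse least squares; it is not the authors' PDP algorithm (no dual variables or perturbation points), and all problem sizes, the ring topology, and the step/penalty constants are hypothetical choices for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical problem data (not from the paper): each of N agents holds
# a private least-squares term; all agents seek a common sparse x.
N, d, m = 4, 10, 8
x_true = np.zeros(d)
x_true[:3] = [1.0, -2.0, 0.5]
A = [rng.normal(size=(m, d)) for _ in range(N)]
b = [A_i @ x_true + 0.01 * rng.normal(size=m) for A_i in A]

def soft_threshold(v, t):
    """Proximal operator of t*||.||_1 (the sparsity-promoting step)."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

# Ring topology with doubly stochastic mixing weights: one multiplication
# by W performs a local averaging step with the two neighbors (consensus).
W = np.zeros((N, N))
for i in range(N):
    W[i, i] = 0.5
    W[i, (i - 1) % N] = 0.25
    W[i, (i + 1) % N] = 0.25

X = np.zeros((N, d))       # one local copy of x per agent
step, lam = 0.005, 0.1     # hypothetical tuning constants
for _ in range(2000):
    X = W @ X                                   # average with neighbors
    for i in range(N):
        grad = A[i].T @ (A[i] @ X[i] - b[i])    # local gradient only
        X[i] = soft_threshold(X[i] - step * grad, step * lam)

x_hat = X.mean(axis=0)
print(np.round(x_hat, 2))
```

Each agent touches only its own data `(A[i], b[i])` and the iterates of its ring neighbors, matching the "fully decentralized protocol" described above; the soft-threshold stands in for the proximal handling of the non-smooth term.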
Keywords
optimization, primal-dual subgradient method, regression analysis, sparse regression, non-smooth sparsity, distributed sparse regression, PDP subgradient method, globally coupled objective function, multi-agent systems, decentralized solution, convergence behavior, gradient methods, average consensus, proximal gradient perturbation point, distributed optimization, primal-dual perturbation