Asynchronous parallel nonconvex large-scale optimization
2017 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP)
Abstract
We propose a novel parallel asynchronous algorithmic framework for the minimization of the sum of a smooth (nonconvex) function and a convex (nonsmooth) regularizer. The framework hinges on Successive Convex Approximation (SCA) techniques and on a novel probabilistic model that describes, in a unified way, a variety of asynchronous settings more faithfully and exhaustively than state-of-the-art models. Key features of our framework are: i) it accommodates inconsistent read, meaning that components of the variables may be written by some cores while being simultaneously read by others; ii) it covers several existing methods in a unified way; and iii) it accommodates a variety of parallel computing architectures. Almost sure convergence to stationary solutions is proved for the general case, and an iteration complexity analysis is given for a specific version of our model. Numerical results show that our scheme outperforms existing asynchronous ones.
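As a rough illustration of the setting described above (not the paper's exact algorithm), the sketch below shows asynchronous block-coordinate updates for F(x) = f(x) + g(x) with an ℓ1 regularizer g. For simplicity f is taken as a convex least-squares term, whereas the framework also covers nonconvex smooth f; the function names (`async_sca`, `soft_threshold`) and the specific surrogate (a linearization of f plus a proximal term, solved in closed form) are illustrative assumptions. Each worker reads the shared iterate without locking, so reads may be inconsistent with concurrent writes by other workers.

```python
import threading
import numpy as np

def soft_threshold(v, t):
    """Prox of t*||.||_1: closed-form solution of the l1 surrogate."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def async_sca(A, b, lam=0.1, n_workers=4, iters_per_worker=500, seed=0):
    """Asynchronous coordinate updates for 0.5*||Ax-b||^2 + lam*||x||_1.

    Illustrative sketch only: f is convex here; the paper's framework
    allows nonconvex smooth f and general convex surrogates.
    """
    m, n = A.shape
    x = np.zeros(n)                            # shared iterate, lock-free
    step = 1.0 / np.linalg.norm(A, 2) ** 2     # 1/L for the quadratic f

    def worker(wid):
        rng = np.random.default_rng(seed + wid)
        for _ in range(iters_per_worker):
            i = rng.integers(n)                # pick a random block
            x_read = x.copy()                  # inconsistent read: no lock
            grad_i = A[:, i] @ (A @ x_read - b)  # partial gradient of f
            # Solve the convex surrogate in block i (a prox step) and
            # write the result back, possibly while others are reading:
            x[i] = soft_threshold(x_read[i] - step * grad_i, step * lam)

    threads = [threading.Thread(target=worker, args=(w,))
               for w in range(n_workers)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    return x
```

The inconsistent-read behavior shows up in `x.copy()`: the snapshot may mix components written by different workers at different times, which is precisely the regime the probabilistic model above is built to capture.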
Keywords
Asynchronous algorithms, big-data, inconsistent read, nonconvex constrained optimization