The Geometric Effects of Distributing Constrained Nonconvex Optimization Problems
2019 IEEE 8th International Workshop on Computational Advances in Multi-Sensor Adaptive Processing (CAMSAP) (2019)
Abstract
A variety of nonconvex machine learning problems have recently been shown to have benign geometric landscapes, in which there are no spurious local minima and all saddle points are strict saddles, at which the Hessian has at least one negative eigenvalue. For such problems, a variety of algorithms can converge to global minimizers. We present a general result relating the geometry of a centralized problem to that of its distributed extension; our result is new in considering the scenario where the centralized problem obeys a manifold constraint, such as when the variables are normalized to the sphere. We show that the first- and second-order stationary points of the centralized and distributed problems are in one-to-one correspondence, implying that the distributed problem, in spite of its additional variables and constraints, can inherit the benign geometry of its centralized counterpart. We apply this result to show that the distributed matrix eigenvalue problem, multichannel blind deconvolution problem, and dictionary learning problem all enjoy benign geometric landscapes.
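The strict-saddle property mentioned above can be illustrated on the (centralized) matrix eigenvalue problem itself: maximizing the quadratic form x^T A x over the unit sphere, i.e. minimizing f(x) = -x^T A x. Its critical points are exactly the eigenvectors of A; the leading eigenvector is the global minimizer, and every other eigenvector is a strict saddle because the Riemannian Hessian 2 P (λ I - A) P (with P the tangent-space projector) has a negative eigenvalue there. The sketch below is a numerical check of this fact in NumPy, not code from the paper:

```python
import numpy as np

# Hypothetical illustration (not the paper's algorithm): verify that for
# f(x) = -x^T A x on the unit sphere, non-leading eigenvectors of a
# symmetric A are strict saddles, i.e. the Riemannian Hessian has at
# least one negative eigenvalue there.
rng = np.random.default_rng(0)
B = rng.standard_normal((5, 5))
A = (B + B.T) / 2                      # random symmetric test matrix
evals, evecs = np.linalg.eigh(A)       # eigenvalues in ascending order

def riemannian_hessian(v, lam):
    """Riemannian Hessian of f(x) = -x^T A x at an eigenvector v of A
    with eigenvalue lam: 2 * P (lam*I - A) P, where P = I - v v^T
    projects onto the tangent space of the sphere at v."""
    P = np.eye(len(v)) - np.outer(v, v)
    return 2 * P @ (lam * np.eye(len(v)) - A) @ P

# Global minimizer: the leading eigenvector. The Hessian is positive
# semidefinite (zero only along v itself, which P annihilates).
H_top = riemannian_hessian(evecs[:, -1], evals[-1])
assert np.linalg.eigvalsh(H_top).min() > -1e-10

# Any other eigenvector is a strict saddle: negative curvature exists
# along the leading eigenvector direction, with value 2*(lam - lam_max).
H_saddle = riemannian_hessian(evecs[:, 0], evals[0])
assert np.linalg.eigvalsh(H_saddle).min() < -1e-10
print("strict-saddle check passed")
```

The paper's contribution is to show that this kind of benign landscape survives when the problem is distributed across agents with consensus constraints; the check above concerns only the centralized sphere-constrained problem.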
Keywords
distributed matrix eigenvalue problem, multichannel blind deconvolution problem, nonconvex optimization problems, nonconvex machine learning problems, spurious local minima, saddle points, strict saddles, negative eigenvalue, global minimizers, distributed extension, centralized and distributed problems, benign geometry