Exterior-point Operator Splitting for Nonconvex Learning

arXiv (Cornell University), 2020

Abstract
In this paper, we present the nonconvex exterior-point operator splitting (NExOS) algorithm, a novel linearly convergent first-order algorithm tailored for constrained nonconvex learning problems. We consider the problem of minimizing a convex cost function over a nonconvex constraint set, where the projection onto the constraint set is single-valued around local minima. A wide range of nonconvex learning problems have this structure, including (but not limited to) sparse and low-rank optimization problems. By exploiting the underlying geometry of the constraint set, NExOS finds a locally optimal point by solving a sequence of penalized problems with strictly decreasing penalty parameters. NExOS solves each penalized problem via an operator splitting algorithm, which converges linearly to a local minimum of the corresponding penalized formulation under regularity conditions. Furthermore, the local minima of the penalized problems converge to a local minimum of the original problem as the penalty parameter goes to zero. We implement NExOS in the open-source Julia package NExOS.jl, which has been extensively tested on many instances from a wide variety of learning problems. We study several examples of well-known nonconvex learning problems, and we show that despite being a general-purpose method, NExOS is able to compute high-quality solutions very quickly and is competitive with specialized algorithms.
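To make the two-level structure described above concrete (an outer loop over strictly decreasing penalty parameters and an inner operator splitting loop for each penalized problem), the following is a minimal Julia sketch for a sparse least-squares instance. It is not the NExOS.jl interface: the exterior-point penalty form, the Douglas–Rachford-style inner iteration, and all function and parameter names (`project_sparse`, `prox_f`, `mu0`, `rho`, `gamma`, and so on) are assumptions chosen for illustration only.

```julia
# Illustrative sketch of the exterior-point idea from the abstract, applied to
#     minimize  (1/2)||A x - b||^2 + (beta/2)||x||^2   subject to  ||x||_0 <= k.
# The nonconvex constraint is handled through the squared-distance penalty
# (1/(2*mu)) * dist(x, X)^2, and each penalized problem is solved with a
# Douglas–Rachford-style splitting. This is NOT the NExOS.jl API.

using LinearAlgebra

# Projection onto the k-sparse set: keep the k largest-magnitude entries.
function project_sparse(v::AbstractVector, k::Int)
    out = zero(v)
    idx = partialsortperm(abs.(v), 1:k; rev = true)
    out[idx] = v[idx]
    return out
end

# Proximal operator of f(x) = (1/2)||Ax - b||^2 + (beta/2)||x||^2.
function prox_f(v, A, b, beta, gamma)
    return (A' * A + (beta + 1 / gamma) * I) \ (A' * b + v / gamma)
end

# Proximal operator of the penalty g(x) = (1/(2*mu)) * dist(x, X)^2, which is a
# convex combination of v and its projection onto X (valid locally where the
# projection is single-valued, as assumed in the abstract).
function prox_penalty(v, k, mu, gamma)
    return (mu * v + gamma * project_sparse(v, k)) / (mu + gamma)
end

# Outer loop: strictly decrease mu; inner loop: Douglas–Rachford splitting,
# warm-started from the previous penalized problem.
function exterior_point_sparse_ls(A, b, k; beta = 1e-2, gamma = 1.0,
                                  mu0 = 1.0, rho = 0.5, outer = 10, inner = 200)
    z = zeros(size(A, 2))
    x = z
    mu = mu0
    for _ in 1:outer
        for _ in 1:inner
            x = prox_f(z, A, b, beta, gamma)
            y = prox_penalty(2x - z, k, mu, gamma)
            z = z + y - x
        end
        mu *= rho                     # strictly decreasing penalty parameter
    end
    return project_sparse(x, k)       # return a feasible (k-sparse) point
end

# Tiny usage example with synthetic data.
A = randn(30, 10); xtrue = zeros(10); xtrue[1:3] .= 1.0
b = A * xtrue + 0.01 * randn(30)
xhat = exterior_point_sparse_ls(A, b, 3)
```

Warm-starting the inner splitting iterations across outer iterations, as sketched above, is what makes the continuation over decreasing penalty parameters cheap in practice; whether NExOS.jl exposes these particular knobs is not stated in this abstract.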
Keywords
learning, exterior-point