Geometric Rescaling Algorithms for Submodular Function Minimization

SODA '18: Symposium on Discrete Algorithms, New Orleans, Louisiana, January 2018 (2017)

Citations 26 | Views 46
Abstract
We present a new class of polynomial-time algorithms for submodular function minimization (SFM), as well as a unified framework to obtain strongly polynomial SFM algorithms. Our algorithms are based on simple iterative methods for the minimum-norm problem, such as the conditional gradient and Fujishige-Wolfe algorithms. We exhibit two techniques to turn simple iterative methods into polynomial-time algorithms. Firstly, we adapt the geometric rescaling technique, which has recently gained attention in linear programming, to SFM and obtain a weakly polynomial bound O((n^4·EO + n^5) log(nL)). Secondly, we exhibit a general combinatorial black-box approach to turn εL-approximate SFM oracles into strongly polynomial exact SFM algorithms. This framework can be applied to a wide range of combinatorial and continuous algorithms, including pseudo-polynomial ones. In particular, we can obtain strongly polynomial algorithms by a repeated application of the conditional gradient or of the Fujishige-Wolfe algorithm. Combined with the geometric rescaling technique, the black-box approach provides an O((n^5·EO + n^6) log^2 n) algorithm. Finally, we show that one of the techniques we develop in the paper can also be combined with the cutting-plane method of Lee, Sidford, and Wong, yielding a simplified variant of their O(n^3 log^2 n·EO + n^4 log^O(1) n) algorithm.
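The minimum-norm problem the abstract refers to is that of finding the point of smallest Euclidean norm in the base polytope B(f); by Fujishige's theorem, the elements with negative coordinates in that point form a minimizer of f. As background for the paper's building blocks, here is a minimal sketch (not the paper's rescaled algorithm) of the plain conditional gradient method, where the linear minimization oracle over B(f) is Edmonds' greedy algorithm. The toy function `f_demo` and all names are illustrative assumptions, not from the paper.

```python
import numpy as np

def f_demo(S):
    """Toy submodular function: a concave function of |S| plus a modular
    term. On a ground set of 5 elements its unique minimizer is the full
    ground set, with value -0.5."""
    k = len(S)
    return min(k, 2) - 0.5 * k

def greedy_vertex(f, w, n):
    """Edmonds' greedy algorithm: returns the vertex of the base polytope
    B(f) minimizing <w, x>, using one oracle call per element."""
    x = np.zeros(n)
    S, prev = [], f(frozenset())
    for i in np.argsort(w, kind="stable"):   # process elements by ascending weight
        S.append(int(i))
        cur = f(frozenset(S))
        x[i] = cur - prev                     # marginal gain of element i
        prev = cur
    return x

def min_norm_point(f, n, iters=5000, tol=1e-12):
    """Conditional gradient (Frank-Wolfe) for min ||x||^2 over B(f): the
    gradient of (1/2)||x||^2 is x itself, so each linear minimization
    oracle call is just the greedy algorithm with weights x."""
    x = greedy_vertex(f, np.zeros(n), n)
    for _ in range(iters):
        s = greedy_vertex(f, x, n)
        d = s - x
        if -x @ d < tol:                      # Frank-Wolfe duality gap
            break
        t = min(1.0, -(x @ d) / (d @ d))      # exact line search over [0, 1]
        x = x + t * d
    return x

# Fujishige's theorem: elements with negative coordinates in the
# minimum-norm point form a minimizer of f.
x = min_norm_point(f_demo, 5)
S_min = {i for i in range(5) if x[i] < 0}
```

The paper's contribution is precisely that this basic scheme converges only pseudo-polynomially on its own; geometric rescaling and the εL-approximate black-box framework are what upgrade it to (strongly) polynomial time.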
Keywords
submodular function minimization, gradient methods, rescaling