A Faster Interior-Point Method for Sum-of-Squares Optimization

International Colloquium on Automata, Languages and Programming (ICALP), 2023

Abstract
We present a faster interior-point method for optimizing sum-of-squares (SOS) polynomials, which are a central tool in polynomial optimization and capture convex programming in the Lasserre hierarchy. Let $p = \sum_i q_i^2$ be an $n$-variate SOS polynomial of degree $2d$. Denoting by $L := \binom{n+d}{d}$ and $U := \binom{n+2d}{2d}$ the dimensions of the vector spaces in which the $q_i$'s and $p$ live, respectively, our algorithm runs in time $\widetilde{O}(LU^{1.87})$. This is polynomially faster than state-of-the-art SOS and semidefinite programming solvers, which achieve runtime $\widetilde{O}(L^{0.5}\min\{U^{2.37}, L^{4.24}\})$. The centerpiece of our algorithm is a dynamic data structure for maintaining the inverse of the Hessian of the SOS barrier function under the polynomial interpolant basis, which efficiently extends to multivariate SOS optimization and requires maintaining spectral approximations to low-rank perturbations of elementwise (Hadamard) products. This is the main challenge and departure from recent IPM breakthroughs using inverse maintenance, where low-rank updates to the slack matrix readily imply the same for the Hessian matrix.
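For intuition, here is a minimal Python sketch of the SOS setup described above (an illustration only, not the paper's algorithm): it computes the dimensions $L$ and $U$ from $n$ and $d$, and certifies a toy univariate polynomial as SOS by exhibiting a positive semidefinite Gram matrix $Q$ with $p(x) = m(x)^\top Q\, m(x)$ over the monomial basis $m(x)$, from which a factorization $Q = V^\top V$ recovers the $q_i$'s. The example polynomial and all function names are hypothetical.

```python
from math import comb

import numpy as np

# Dimensions of the spaces where the q_i's (degree <= d, n variables)
# and p (degree <= 2d) live, matching the abstract's L and U.
def sos_dims(n: int, d: int) -> tuple[int, int]:
    L = comb(n + d, d)          # dim of degree-<=d polynomials in n vars
    U = comb(n + 2 * d, 2 * d)  # dim of degree-<=2d polynomials in n vars
    return L, U

# Toy certificate (hypothetical, chosen by hand):
#   p(x) = x^4 + 6x^2 + 1 = (x^2 + 1)^2 + (2x)^2.
# With monomial basis m(x) = [1, x, x^2], p(x) = m(x)^T Q m(x) for the
# PSD Gram matrix Q below.
Q = np.array([[1.0, 0.0, 1.0],
              [0.0, 4.0, 0.0],
              [1.0, 0.0, 1.0]])

eigvals, eigvecs = np.linalg.eigh(Q)
assert eigvals.min() > -1e-9, "Q must be PSD to certify p as SOS"

# Q = V^T V, so p = sum of squares of the entries of V m(x).
V = np.diag(np.sqrt(np.clip(eigvals, 0.0, None))) @ eigvecs.T

x = 0.7  # spot-check the identity at one point
m = np.array([1.0, x, x * x])
p_direct = x**4 + 6 * x**2 + 1
p_gram = m @ Q @ m
p_sos = np.sum((V @ m) ** 2)

print(sos_dims(n=1, d=2))  # (3, 5): L and U for this univariate example
print(np.isclose(p_direct, p_gram), np.isclose(p_direct, p_sos))
```

Note that for $n = 1$, $d = 2$ the sketch gives $L = 3$ and $U = 5$, matching the sizes of the bases for $q_i$ (degree $\le 2$) and $p$ (degree $\le 4$); the paper's contribution is maintaining the inverse Hessian of the barrier over such Gram-matrix parameterizations dynamically, which this toy check does not attempt.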
Keywords
Interior point methods,Sum-of-squares optimization,Dynamic matrix inverse,Convex optimization