A Practical and Optimal First-Order Method for Large-Scale Convex Quadratic Programming

arXiv (2023)

Abstract
Convex quadratic programming (QP) is an important class of optimization problems with wide practical applications. Classic QP solvers are based on either the simplex or the barrier method, both of which suffer from scalability issues because their computational bottleneck is solving linear equations. In this paper, we design and analyze a first-order method for QP, called restarted accelerated primal-dual hybrid gradient (rAPDHG), whose computational bottleneck is matrix-vector multiplication. We show that rAPDHG converges linearly to an optimal solution when solving QP, and that the obtained linear rate is optimal among a wide class of primal-dual methods. Furthermore, we connect the linear rate with a sharpness constant of the KKT system of the QP, a standard quantity for measuring the hardness of a continuous optimization problem. Numerical experiments demonstrate that both restarts and acceleration can significantly improve the performance of the algorithm. Lastly, we present PDQP.jl, an open-source prototype implementation of rAPDHG in Julia that can run on both CPU and GPU, and report a numerical comparison with SCS and OSQP on standard QP benchmark sets.
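To make the abstract's claim concrete that the per-iteration bottleneck is matrix-vector multiplication, here is a minimal sketch of a vanilla primal-dual hybrid gradient iteration applied to an equality-constrained QP. This is an assumption-laden illustration, not the paper's rAPDHG: it omits the restarts and acceleration the paper adds, uses a simple gradient step on the quadratic (Condat-Vũ style), and the step sizes and toy problem are invented for the example.

```python
# Hedged sketch: plain PDHG-style iterations for the toy QP
#     min 0.5 x'Qx + c'x   s.t.  Ax = b.
# NOT the paper's rAPDHG (no restarts, no acceleration); it only shows
# that each iteration costs a few matrix-vector products, never a linear solve.
import numpy as np

def pdhg_qp(Q, c, A, b, tau=0.3, sigma=0.3, iters=2000):
    """Primal-dual iterations on L(x, y) = 0.5 x'Qx + c'x + y'(Ax - b)."""
    x = np.zeros(Q.shape[0])
    y = np.zeros(A.shape[0])
    for _ in range(iters):
        # primal gradient descent step: cost is Q @ x and A.T @ y
        x_new = x - tau * (Q @ x + c + A.T @ y)
        # dual ascent step with the extrapolated point 2*x_new - x
        y = y + sigma * (A @ (2 * x_new - x) - b)
        x = x_new
    return x, y

# Toy instance: min 0.5*||x||^2  s.t.  x1 + x2 = 1; optimum is x = (0.5, 0.5).
Q = np.eye(2)
c = np.zeros(2)
A = np.array([[1.0, 1.0]])
b = np.array([1.0])
x, y = pdhg_qp(Q, c, A, b)
```

On this instance the iterates converge to the KKT point x = (0.5, 0.5), y = -0.5; the paper's contribution is making such iterations converge linearly at an optimal rate via restarts and acceleration.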