# Interior Point Methods with a Gradient Oracle

Proceedings of the 55th Annual ACM Symposium on Theory of Computing (STOC 2023)

Abstract

We provide an interior point method based on quasi-Newton iterations, which only requires first-order access to a strongly self-concordant barrier function. To achieve this, we extend the techniques of Dunagan-Harvey [STOC '07] to maintain a preconditioner, while using only first-order information. We measure the quality of this preconditioner in terms of its relative excentricity to the unknown Hessian matrix, and we generalize these techniques to convex functions with a slowly-changing Hessian. We combine this with an interior point method to show that, given first-order access to an appropriate barrier function for a convex set K, we can solve well-conditioned linear optimization problems over K to ε precision in time Õ((T + n²)√ν log(1/ε)), where ν is the self-concordance parameter of the barrier function, and T is the time required to make a gradient query. As a consequence we show that:

- Linear optimization over n-dimensional convex sets can be solved in time Õ((T + n²)√n). This parallels the running time achieved by state-of-the-art algorithms for cutting plane methods, when replacing separation oracles with first-order oracles for an appropriate barrier function.
- We can solve semidefinite programs involving matrices in ℝ^{n×n} in time Õ(mn⁴ + m^{1.25}n^{3.5} log(1/ε)), improving over the state-of-the-art algorithms in the case where m = Ω(n^{3.5/(ω−1.25)}), where ω is the matrix multiplication exponent.

Along the way we develop a host of tools allowing us to control the evolution of our potential functions, using techniques from matrix analysis and Schur convexity.
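To illustrate the general idea of quasi-Newton preconditioning with only a gradient oracle, the sketch below maintains an inverse-Hessian approximation via the standard BFGS update, which is driven purely by gradient queries. This is a minimal illustration of the quasi-Newton principle, not the paper's algorithm: the paper maintains its preconditioner with a Dunagan-Harvey-style scheme measured by relative excentricity, and all names here (the function, the test matrix A, the vector b) are chosen for the example.

```python
import numpy as np

def quasi_newton_minimize(grad, x0, steps=200, lr=1.0):
    """Illustrative sketch (not the paper's method): maintain a
    preconditioner H approximating the inverse Hessian using only
    gradient queries (BFGS update), and step along -H @ grad(x)."""
    x = x0.copy()
    n = x.size
    H = np.eye(n)            # preconditioner: inverse-Hessian approximation
    g = grad(x)
    for _ in range(steps):
        p = -H @ g           # preconditioned (quasi-Newton) direction
        x_new = x + lr * p
        g_new = grad(x_new)
        s, y = x_new - x, g_new - g
        sy = s @ y
        if sy > 1e-12:       # curvature condition keeps H positive definite
            rho = 1.0 / sy
            V = np.eye(n) - rho * np.outer(s, y)
            H = V @ H @ V.T + rho * np.outer(s, s)
        x, g = x_new, g_new
    return x

# Example: minimize the strongly convex quadratic f(x) = 0.5 x^T A x - b^T x,
# whose gradient is A x - b, so the minimizer solves the linear system A x = b.
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 1.0])
x_star = quasi_newton_minimize(lambda x: A @ x - b, np.zeros(2))
```

On a quadratic, each gradient difference y = A s reveals curvature along s, so H converges to A⁻¹ and the iteration behaves like Newton's method despite never forming the Hessian; this is the sense in which first-order information alone can sustain a good preconditioner.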


Key words

interior point methods, linear systems, preconditioning
