Optimal and Adaptive Monteiro-Svaiter Acceleration

NeurIPS 2022

Abstract
We develop a variant of the Monteiro-Svaiter (MS) acceleration framework that removes the need to solve an expensive implicit equation at every iteration. Consequently, for any $p\ge 2$ we improve the complexity of convex optimization with Lipschitz $p$th derivative by a logarithmic factor, matching a lower bound. We also introduce an MS subproblem solver that requires no knowledge of problem parameters, and implement it as either a second- or first-order method by solving linear systems or applying MinRes, respectively. On logistic regression our method outperforms previous second-order momentum methods, but under-performs Newton's method; simply iterating our first-order adaptive subproblem solver performs comparably to L-BFGS.
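Below is a minimal sketch of the kind of regularized Newton subproblem the abstract alludes to, solved approximately with MinRes on a logistic regression objective. It is not the paper's adaptive, parameter-free MS solver: the fixed regularization `sigma`, the data names `X` and `y`, and the single-step structure are illustrative assumptions.

```python
# Sketch (not the paper's algorithm): one regularized Newton step for
# logistic regression, with the linear system solved matrix-free by MinRes.
import numpy as np
from scipy.sparse.linalg import LinearOperator, minres

def logistic_loss_grad(w, X, y):
    """Binary logistic loss (labels in {-1, +1}) and its gradient."""
    z = y * (X @ w)
    loss = np.mean(np.log1p(np.exp(-z)))
    s = -y / (1.0 + np.exp(z))            # per-sample derivative of the loss
    grad = X.T @ s / X.shape[0]
    return loss, grad

def hessian_vector_product(w, X, v):
    """Hessian-vector product of the logistic loss without forming the Hessian."""
    p = 1.0 / (1.0 + np.exp(-(X @ w)))    # predicted probabilities
    d = p * (1.0 - p)                     # diagonal weights of the Hessian
    return X.T @ (d * (X @ v)) / X.shape[0]

def regularized_newton_step(w, X, y, sigma=1e-2):
    """Approximately solve (H + sigma * I) d = -g with MinRes and take the step."""
    _, g = logistic_loss_grad(w, X, y)
    n = w.size
    H_op = LinearOperator(
        (n, n),
        matvec=lambda v: hessian_vector_product(w, X, v) + sigma * v,
    )
    d, _ = minres(H_op, -g, maxiter=100)
    return w + d

# Usage on synthetic data (illustrative only).
rng = np.random.default_rng(0)
X = rng.standard_normal((200, 10))
y = np.sign(X @ rng.standard_normal(10) + 0.1 * rng.standard_normal(200))
w = np.zeros(10)
for _ in range(10):
    w = regularized_newton_step(w, X, y)
print("final loss:", logistic_loss_grad(w, X, y)[0])
```

The matrix-free `LinearOperator` mirrors the abstract's point that the first-order variant only needs Hessian-vector products rather than an explicit Hessian factorization; the paper's method additionally chooses the regularization adaptively, which this sketch does not attempt.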
Keywords
convex optimization,optimization theory,second-order methods,Monteiro-Svaiter acceleration,proximal points,momentum,Newton's method,cubic regularization,conjugate residuals,oracle complexity,optimal algorithms,adaptive methods,parameter-free methods