Trust-Region Newton-CG with Strong Second-Order Complexity Guarantees for Nonconvex Optimization

SIAM Journal on Optimization (2021)

Abstract
Worst-case complexity guarantees for nonconvex optimization algorithms have been a topic of growing interest. Multiple frameworks that achieve the best known complexity bounds among a broad class of first- and second-order strategies have been proposed. These methods have often been designed primarily with complexity guarantees in mind and, as a result, represent a departure from the algorithms that have proved to be the most effective in practice. In this paper, we consider trust-region Newton methods, one of the most popular classes of algorithms for solving nonconvex optimization problems. By introducing slight modifications to the original scheme, we obtain two methods-one based on exact subproblem solves and one exploiting inexact subproblem solves as in the popular "trust-region Newton-conjugate gradient" (trust-region Newton-CG) method-with iteration and operation complexity bounds that match the best known bounds for the aforementioned class of first- and second-order methods. The resulting trust-region Newton-CG method also retains the attractive practical behavior of classical trust-region Newton-CG, which we demonstrate with numerical comparisons on a standard benchmark test set.
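To make the "trust-region Newton-CG" idea concrete, the sketch below implements a generic trust-region Newton method whose subproblem is solved inexactly by the classical Steihaug-Toint truncated conjugate gradient iteration (stopping at the trust-region boundary or on detecting negative curvature). This is a minimal illustration of the standard scheme the paper builds on, not the modified method with complexity guarantees introduced in the paper; all function and parameter names are illustrative.

```python
import numpy as np

def steihaug_cg(H, g, delta, tol=1e-8, max_iter=100):
    """Approximately minimize m(p) = g.p + 0.5 p.H.p s.t. ||p|| <= delta
    via the Steihaug-Toint truncated conjugate gradient method."""
    p = np.zeros_like(g)
    r = g.copy()          # residual of the Newton system: H p + g
    if np.linalg.norm(r) < tol:
        return p
    d = -r
    for _ in range(max_iter):
        Hd = H @ d
        dHd = d @ Hd
        if dHd <= 0:
            # Negative curvature: follow d to the trust-region boundary.
            return _to_boundary(p, d, delta)
        alpha = (r @ r) / dHd
        p_next = p + alpha * d
        if np.linalg.norm(p_next) >= delta:
            # Step would leave the region: truncate at the boundary.
            return _to_boundary(p, d, delta)
        r_next = r + alpha * Hd
        if np.linalg.norm(r_next) < tol:
            return p_next
        beta = (r_next @ r_next) / (r @ r)
        d = -r_next + beta * d
        p, r = p_next, r_next
    return p

def _to_boundary(p, d, delta):
    # Solve ||p + tau*d|| = delta for the positive root tau.
    a, b, c = d @ d, 2 * (p @ d), p @ p - delta**2
    tau = (-b + np.sqrt(b * b - 4 * a * c)) / (2 * a)
    return p + tau * d

def tr_newton_cg(f, grad, hess, x0, delta0=1.0, eta=0.1,
                 tol=1e-6, max_iter=200):
    """Basic trust-region Newton-CG loop with a standard radius update."""
    x, delta = np.asarray(x0, dtype=float), delta0
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) < tol:
            break
        H = hess(x)
        p = steihaug_cg(H, g, delta)
        pred = -(g @ p + 0.5 * p @ (H @ p))   # predicted model decrease
        ared = f(x) - f(x + p)                # actual decrease
        rho = ared / pred if pred > 0 else -1.0
        if rho > eta:
            x = x + p                          # accept the step
        if rho < 0.25:
            delta *= 0.25                      # shrink on poor agreement
        elif rho > 0.75 and np.isclose(np.linalg.norm(p), delta):
            delta = min(2 * delta, 100.0)      # expand on strong agreement
    return x
```

On the nonconvex Rosenbrock function, for example, `tr_newton_cg` with the exact gradient and Hessian converges from the usual starting point `(-1.2, 1.0)` to the minimizer `(1, 1)`.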
Keywords
smooth nonconvex optimization, trust-region methods, Newton's method, conjugate gradient method, Lanczos method, worst-case complexity, negative curvature