SANIA: Polyak-type Optimization Framework Leads to Scale Invariant Stochastic Algorithms
CoRR (2023)
Abstract
Adaptive optimization methods are widely recognized as among the most popular
approaches for training Deep Neural Networks (DNNs). Techniques such as Adam,
AdaGrad, and AdaHessian utilize a preconditioner that modifies the search
direction by incorporating information about the curvature of the objective
function. However, despite their adaptive nature, these methods still require
manual fine-tuning of the step-size, which in turn affects the time required
to solve a given problem. This paper presents an optimization framework named
SANIA to tackle these challenges. Beyond eliminating the need to manually tune
the step-size hyperparameter, SANIA incorporates techniques to
address poorly scaled or ill-conditioned problems. We also explore several
preconditioning methods, including Hutchinson's method, which approximates the
Hessian diagonal of the loss function. We conclude with an extensive empirical
examination of the proposed techniques across classification tasks, covering
both convex and non-convex contexts.
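Since the abstract only names Hutchinson's method, the following is a minimal sketch of how such a diagonal-Hessian estimate can be formed and combined with a Polyak-type step, assuming a PyTorch autograd API. The names `hutchinson_diag`, `polyak_precond_step`, and `f_star` are illustrative assumptions, not the authors' SANIA implementation.

```python
import torch

def hutchinson_diag(loss, params, num_samples=10):
    # Hutchinson's estimator: diag(H) ~= E[z * (H z)] with Rademacher z.
    grads = torch.autograd.grad(loss, params, create_graph=True)
    diag = [torch.zeros_like(p) for p in params]
    for _ in range(num_samples):
        # Rademacher probe vectors with entries in {-1, +1}
        zs = [torch.randint_like(p, high=2) * 2.0 - 1.0 for p in params]
        # Hessian-vector products via a second backward pass
        hvps = torch.autograd.grad(grads, params, grad_outputs=zs,
                                   retain_graph=True)
        for d, z, hvp in zip(diag, zs, hvps):
            d += z * hvp / num_samples
    return grads, diag

def polyak_precond_step(params, loss, f_star=0.0, eps=1e-8):
    # Illustrative Polyak-type step with a diagonal preconditioner;
    # f_star is the assumed optimal value (often 0 under interpolation).
    grads, diag = hutchinson_diag(loss, params)
    # Squared gradient norm in the metric induced by the preconditioner
    sq_norm = sum(((g * g) / (d.abs() + eps)).sum()
                  for g, d in zip(grads, diag))
    step = (loss.detach() - f_star) / (sq_norm + eps)  # no tuned step-size
    with torch.no_grad():
        for p, g, d in zip(params, grads, diag):
            p -= step * g / (d.abs() + eps)
```

In the stochastic setting, f_star would be a per-batch optimal value and the diagonal estimate is typically smoothed across iterations; both choices here are assumptions made for the sketch.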