A novel interpretation of Nesterov's acceleration via variable step-size linear multistep methods
CoRR (2024)
Abstract
Nesterov's acceleration in continuous optimization can be understood in a
novel way when Nesterov's accelerated gradient (NAG) method is considered as a
linear multistep (LM) method for gradient flow. Although the NAG method for
strongly convex functions (NAG-sc) has been fully discussed, the NAG method for
L-smooth convex functions (NAG-c) has not. To fill this gap, we show that the
existing NAG-c method can be interpreted as a variable step-size LM (VLM)
method for gradient flow. Surprisingly, this VLM admits linearly increasing step sizes,
which explains the acceleration in the convex case. Here, we introduce a novel
technique for analyzing the absolute stability of VLMs. Subsequently, we prove
that NAG-c is optimal within a certain natural class of VLMs. Finally, we construct
a new, broader class of VLMs by optimizing the VLM parameters for
ill-conditioned problems. According to numerical experiments, the proposed
method outperforms the NAG-c method in ill-conditioned cases. These results
imply that the numerical-analysis perspective on NAG is a promising framework,
and that studying a broader class of VLMs could reveal further novel
methods.
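
As a concrete illustration of the iteration the abstract discusses, below is a minimal Python sketch of the standard NAG-c method (a gradient step of size 1/L from an extrapolated point, followed by momentum with the classical k/(k+3) weight), run on an ill-conditioned quadratic and compared against plain gradient descent, i.e. forward Euler for the gradient flow x' = -grad f(x). The quadratic test problem, its conditioning, and the iteration count are illustrative assumptions, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 50
eigs = np.logspace(-3, 0, n)      # spectrum of A; condition number ~ 1e3
A = np.diag(eigs)
L = eigs.max()                    # smoothness constant of f(x) = 0.5 x^T A x

def grad(x):
    return A @ x

def f(x):
    return 0.5 * x @ A @ x

x0 = rng.standard_normal(n)
x, y = x0.copy(), x0.copy()       # NAG-c iterates
x_gd = x0.copy()                  # plain gradient descent / forward Euler

for k in range(500):
    # NAG-c: gradient step of size 1/L from the extrapolated point y_k ...
    x_new = y - grad(y) / L
    # ... then momentum with the classical k/(k+3) weight.
    y = x_new + (k / (k + 3)) * (x_new - x)
    x = x_new
    # Forward Euler for the gradient flow x' = -grad f(x), same step 1/L.
    x_gd = x_gd - grad(x_gd) / L

print(f"NAG-c:            f = {f(x):.3e}")
print(f"gradient descent: f = {f(x_gd):.3e}")
```

On such a problem NAG-c converges markedly faster, consistent with the VLM reading above: a fixed-step Euler method is stability-limited to steps of order 1/L, whereas the VLM interpretation lets the effective step sizes for the gradient flow grow linearly.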