From Halpern’s fixed-point iterations to Nesterov’s accelerated interpretations for root-finding problems

Computational Optimization and Applications (2024)

Abstract
We derive an equivalent form of Halpern's fixed-point iteration scheme for solving a co-coercive equation (also called a root-finding problem), which can be viewed as a Nesterov accelerated interpretation. We show that the two methods are equivalent via a simple transformation, leading to a straightforward convergence proof for Nesterov's accelerated scheme. Alternatively, we directly establish convergence rates for the Nesterov accelerated variant and, as a consequence, obtain a new convergence rate for Halpern's fixed-point iteration. Next, we apply our results to several methods for solving monotone inclusions, where our convergence guarantees carry over. Since the gradient/forward scheme requires co-coerciveness of the underlying operator, we derive new Nesterov accelerated variants of both the recent extra-anchored gradient and past-extra-anchored gradient methods from the literature. These variants relax the co-coerciveness condition by assuming only monotonicity and Lipschitz continuity of the underlying operator. Interestingly, our new Nesterov accelerated interpretation of the past-extra-anchored gradient method involves two past-iterate correction terms. This formulation is expected to guide the development of new Nesterov accelerated methods for minimax problems, and of their continuous-time views, without co-coerciveness. We test our theoretical results on two numerical examples, where the observed convergence rates match the theoretical ones up to a constant factor.
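To make the scheme under discussion concrete, here is a minimal sketch of the classical Halpern fixed-point iteration, x_{k+1} = β_k x_0 + (1 − β_k) T(x_k) with the standard anchoring weight β_k = 1/(k + 2). The affine operator T below is a hypothetical example chosen for illustration (not from the paper); any nonexpansive map would do.

```python
import numpy as np

def halpern(T, x0, iters=2000):
    """Halpern fixed-point iteration for x = T(x):
    x_{k+1} = beta_k * x0 + (1 - beta_k) * T(x_k), beta_k = 1/(k+2).
    The iterate is anchored at x0, which is what yields the O(1/k)
    rate on the residual ||x_k - T(x_k)|| for nonexpansive T."""
    x = x0.copy()
    for k in range(iters):
        beta = 1.0 / (k + 2)
        x = beta * x0 + (1.0 - beta) * T(x)
    return x

# Hypothetical test operator: T(x) = A x + b with ||A|| < 1,
# so T is a contraction (hence nonexpansive) with a unique fixed point.
A = np.array([[0.5, 0.2],
              [0.2, 0.5]])
b = np.array([1.0, -1.0])
T = lambda x: A @ x + b

x_star = halpern(T, np.zeros(2))
residual = np.linalg.norm(x_star - T(x_star))
print(residual)  # small fixed-point residual after 2000 iterations
```

The anchoring term β_k (x_0 − x_k) is what distinguishes Halpern's scheme from the plain Picard iteration x_{k+1} = T(x_k); rewriting the update in terms of momentum on successive iterates is what produces the Nesterov-style interpretation studied in the paper.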
Keywords
Halpern's fixed-point iteration, Nesterov's accelerated method, Co-coercive equation, Maximally monotone inclusion, Extra-anchored gradient method