An optimization derivation of the method of conjugate gradients

David Ek, Anders Forsgren

arXiv (2020)

Abstract
We give a derivation of the method of conjugate gradients based on the requirement that each iterate minimizes a strictly convex quadratic on the space spanned by the previously observed gradients. Rather than verifying that the search direction has the correct properties, we show that generating such iterates is equivalent to generating orthogonal gradients, which yields the description of the search direction and the step length. Our approach gives a straightforward way to see that the search direction of the method of conjugate gradients is a negative scalar times the gradient of minimum Euclidean norm on the subspace generated so far.
Keywords
optimization derivation
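
The abstract builds the derivation around the fact that conjugate gradient iterates on a strictly convex quadratic f(x) = ½xᵀAx − bᵀx produce mutually orthogonal gradients. Below is a minimal NumPy sketch of the standard conjugate gradient iteration that checks this orthogonality numerically; the function name, the random test problem, and the tolerance are illustrative assumptions and not taken from the paper.

```python
import numpy as np

def conjugate_gradient(A, b, x0, tol=1e-10, max_iter=None):
    """Minimize the strictly convex quadratic f(x) = 0.5*x'Ax - b'x,
    i.e. solve Ax = b for symmetric positive definite A."""
    max_iter = max_iter or b.size
    x = x0.copy()
    g = A @ x - b            # gradient of f at x
    p = -g                   # first search direction: steepest descent
    gradients = [g.copy()]   # store gradients to check mutual orthogonality
    for _ in range(max_iter):
        if np.linalg.norm(g) <= tol:
            break
        Ap = A @ p
        alpha = (g @ g) / (p @ Ap)        # exact line-search step length
        x = x + alpha * p
        g_new = A @ x - b
        beta = (g_new @ g_new) / (g @ g)  # standard CG update (forms coincide on quadratics)
        p = -g_new + beta * p             # negative gradient plus conjugate correction
        g = g_new
        gradients.append(g.copy())
    return x, gradients

# Small demonstration on a random SPD system (assumed example).
rng = np.random.default_rng(0)
M = rng.standard_normal((6, 6))
A = M @ M.T + 6 * np.eye(6)   # symmetric positive definite
b = rng.standard_normal(6)
x, grads = conjugate_gradient(A, b, np.zeros(6))

# Off-diagonal entries of the Gram matrix of gradients should be ~0,
# reflecting the orthogonality the derivation is built around.
G = np.array(grads[:-1])
gram = G @ G.T
print(np.max(np.abs(gram - np.diag(np.diag(gram)))))  # ~0
print(np.linalg.norm(A @ x - b))                       # ~0
```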