Gradient Descent Converges to Minimizers.

arXiv: Machine Learning (2016)

Abstract
We show that gradient descent converges to a local minimizer, almost surely with random initialization. This is proved by applying the Stable Manifold Theorem from dynamical systems theory.
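The result can be illustrated with a minimal numerical sketch (not taken from the paper): on a function with a strict saddle point and isolated minimizers, gradient descent from a random initialization avoids the saddle and lands on a minimizer. The test function f(x, y) = x⁴/4 − x²/2 + y²/2, the step size, and the iteration count below are all illustrative choices; it has a saddle at the origin and minimizers at (±1, 0).

```python
import numpy as np

# Illustrative sketch (not the paper's experiment): gradient descent on
# f(x, y) = x**4/4 - x**2/2 + y**2/2, which has a strict saddle at the
# origin and local minimizers at (+1, 0) and (-1, 0).

def grad(p):
    x, y = p
    # gradient of f: (x**3 - x, y)
    return np.array([x**3 - x, y])

rng = np.random.default_rng(0)
p = rng.uniform(-0.5, 0.5, size=2)  # random initialization near the saddle

for _ in range(10_000):
    p = p - 0.1 * grad(p)  # plain gradient descent, fixed step size

print(p)  # converges to a minimizer (+-1, 0), not the saddle at the origin
```

A random draw puts the iterate off the saddle's stable manifold with probability one, which is the mechanism the Stable Manifold Theorem argument formalizes.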