Topmoumoute Online Natural Gradient Algorithm

NIPS 2007

Cited by 246 | Viewed 124
Abstract
Guided by the goal of obtaining an optimization algorithm that is both fast and yields good generalization, we study the descent direction maximizing the decrease in generalization error or the probability of not increasing generalization error. The surprising result is that from both the Bayesian and frequentist perspectives this can yield the natural gradient direction. Although that direction can be very expensive to compute, we develop an efficient, general, online approximation to natural gradient descent which is suited to large scale problems. We report experimental results showing much faster convergence in computation time and in number of iterations with TONGA (Topmoumoute Online Natural Gradient Algorithm) than with stochastic gradient descent, even on very large datasets.
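The direction the abstract refers to is the natural gradient C^{-1}g, where g is the ordinary (mean) gradient and C the covariance of per-example gradients. The sketch below illustrates a plain dense, batch version of this step on a synthetic least-squares problem; the learning rate, damping term, and problem setup are illustrative assumptions, not taken from the paper, and TONGA's actual contribution, a low-rank online approximation of C, is not implemented here.

```python
import numpy as np

# Minimal sketch of a natural-gradient step on synthetic least squares.
# Dense batch illustration only; TONGA maintains a low-rank *online*
# estimate of the gradient covariance C, which this sketch does not.

rng = np.random.default_rng(0)
n, d = 200, 5
X = rng.normal(size=(n, d))
w_true = rng.normal(size=d)                 # illustrative ground truth
y = X @ w_true + 0.1 * rng.normal(size=n)

w = np.zeros(d)
lr, damping = 0.1, 0.1                      # illustrative hyperparameters

for step in range(200):
    residual = X @ w - y
    per_example_grads = X * residual[:, None]   # one gradient row per example
    g = per_example_grads.mean(axis=0)          # ordinary (mean) gradient
    centered = per_example_grads - g            # deviations from the mean
    C = centered.T @ centered / n + damping * np.eye(d)  # damped covariance
    natural_grad = np.linalg.solve(C, g)        # natural direction C^{-1} g
    w -= lr * natural_grad

print("parameter error:", np.linalg.norm(w - w_true))
```

The damping term keeps C invertible and bounds the step size when gradients become small near the optimum; the paper's online, low-rank estimate is what makes the C^{-1}g computation affordable at scale.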
Keywords
stochastic gradient descent, gradient descent, generalization error