Connecting Optimization and Regularization Paths.
Advances in Neural Information Processing Systems 31 (NIPS 2018)
Abstract
We study the implicit regularization properties of optimization techniques by explicitly connecting their optimization paths to the regularization paths of "corresponding" regularized problems. This surprising connection shows that iterates of optimization techniques such as gradient descent and mirror descent are point-wise close to solutions of appropriately regularized objectives. While such a tight connection between optimization and regularization is of independent intellectual interest, it also has important implications for machine learning: we can port results from regularized estimators to optimization, and vice versa. We investigate one key consequence, which borrows from the well-studied analysis of regularized estimators to obtain tight excess risk bounds for the iterates generated by optimization techniques.
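To make the stated connection concrete, below is a minimal numerical sketch for the least-squares setting. It assumes the standard correspondence between iteration count and regularization strength, lambda ≈ 1/(eta*t) for gradient descent with step size eta started at the origin; the paper's exact mapping and conditions may differ, and the data, step size, and schedule here are illustrative choices, not the authors' construction.

```python
import numpy as np

# Sketch: compare the gradient-descent iterate after t steps with the
# ridge solution at the matched strength lam = 1/(eta * t).
# Assumption: least-squares objective f(theta) = (1/2n)||X theta - y||^2,
# GD started at the origin; the lam = 1/(eta*t) mapping is the standard
# gradient-flow correspondence, used here only for illustration.

rng = np.random.default_rng(0)
n, d = 200, 20
X = rng.standard_normal((n, d))
y = X @ rng.standard_normal(d) + 0.1 * rng.standard_normal(n)

H = X.T @ X / n          # Hessian of the least-squares objective
g = X.T @ y / n
eta = 0.1                # step size; must satisfy eta < 2 / lambda_max(H)
theta = np.zeros(d)      # gradient descent starts at the origin

for t in range(1, 2001):
    theta -= eta * (H @ theta - g)    # one GD step on f
    if t % 500 == 0:
        lam = 1.0 / (eta * t)         # matched regularization strength
        # Ridge solution: argmin f(theta) + (lam/2)||theta||^2
        ridge = np.linalg.solve(H + lam * np.eye(d), g)
        print(f"t={t:5d}  ||theta_GD - theta_ridge|| = "
              f"{np.linalg.norm(theta - ridge):.4f}")
```

As t grows (and lambda = 1/(eta*t) shrinks), both the GD iterate and the ridge solution approach the unregularized minimizer, and the printed gap illustrates the point-wise closeness of the two paths along the way.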
Keywords
gradient descent, machine learning