Analytic Test Functions for Generalizable Evaluation of Convex Optimization Techniques

2020 SoutheastCon (2020)

Abstract
Convex optimization algorithms such as gradient descent, quasi-Newton methods, and their variants are designed to find the global minimum of strongly convex functions. When these algorithms are applied to the minimization of non-convex functions, they offer no robust theoretical guarantees. Despite the lack of guarantees, many of these methods still find good solutions in practice and are widely used in academia and industry to solve non-convex problems. In this paper, a set of analytic test functions and transformations is presented that can be used to quantify the expected performance of optimization algorithms on difficult (non-convex) optimization problems. The test functions and transformations in this set are used to compare and evaluate the convergence rates of stochastic gradient descent, L-BFGS, AdaGrad, and Adam.
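A minimal sketch of the kind of evaluation the abstract describes, assuming the Rosenbrock function as a stand-in for the paper's (unspecified) analytic test set. The step sizes, iteration counts, and starting point below are illustrative assumptions, not the paper's settings; plain full-batch gradient descent stands in for SGD (on a deterministic analytic function the two coincide), and L-BFGS is run through SciPy rather than a custom implementation.

import numpy as np
from scipy.optimize import minimize

def rosenbrock(x):
    # Classic non-convex benchmark with a curved, narrow valley; minimum at (1, 1).
    return 100.0 * (x[1] - x[0] ** 2) ** 2 + (1.0 - x[0]) ** 2

def rosenbrock_grad(x):
    return np.array([
        -400.0 * x[0] * (x[1] - x[0] ** 2) - 2.0 * (1.0 - x[0]),
        200.0 * (x[1] - x[0] ** 2),
    ])

def gradient_descent(grad, x0, lr=1e-3, steps=20000):
    # Plain gradient descent; the step size must stay below 2/L for the
    # local curvature L, so progress along the valley floor is slow.
    x = x0.copy()
    for _ in range(steps):
        x -= lr * grad(x)
    return x

def adagrad(grad, x0, lr=0.1, eps=1e-8, steps=5000):
    # AdaGrad: per-coordinate steps scaled by the accumulated squared gradients.
    x = x0.copy()
    s = np.zeros_like(x)
    for _ in range(steps):
        g = grad(x)
        s += g ** 2
        x -= lr * g / (np.sqrt(s) + eps)
    return x

def adam(grad, x0, lr=1e-2, beta1=0.9, beta2=0.999, eps=1e-8, steps=5000):
    # Textbook Adam update with bias-corrected first and second moments.
    x = x0.copy()
    m = np.zeros_like(x)
    v = np.zeros_like(x)
    for t in range(1, steps + 1):
        g = grad(x)
        m = beta1 * m + (1.0 - beta1) * g
        v = beta2 * v + (1.0 - beta2) * g ** 2
        x -= lr * (m / (1.0 - beta1 ** t)) / (np.sqrt(v / (1.0 - beta2 ** t)) + eps)
    return x

x0 = np.array([-1.2, 1.0])  # conventional Rosenbrock starting point
print("gradient descent:", gradient_descent(rosenbrock_grad, x0))
print("AdaGrad:         ", adagrad(rosenbrock_grad, x0))
print("Adam:            ", adam(rosenbrock_grad, x0))
lbfgs = minimize(rosenbrock, x0, jac=rosenbrock_grad, method="L-BFGS-B")
print("L-BFGS:          ", lbfgs.x)

Tracking the iterate (or the function value) over iterations, rather than only the final point, is what turns a run like this into a convergence-rate comparison of the kind the paper reports.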
Keywords
analytic test functions, generalizable evaluation, convex optimization, quasi-Newton methods, convex functions, non-convex functions, non-convex problems, stochastic gradient descent, global minimum, minimization, non-convex optimization, convergence rates, L-BFGS, AdaGrad, Adam