How Free is Parameter-Free Stochastic Optimization?
CoRR (2024)
Abstract
We study the problem of parameter-free stochastic optimization, inquiring
whether, and under what conditions, fully parameter-free methods exist:
these are methods that achieve convergence rates competitive with optimally
tuned methods, without requiring significant knowledge of the true problem
parameters. Existing parameter-free methods can only be considered
“partially” parameter-free, as they require some non-trivial knowledge of the
true problem parameters, such as a bound on the stochastic gradient norms or
a bound on the distance to a minimizer. In the non-convex setting, we
demonstrate that a simple hyperparameter search technique results in a fully
parameter-free method that outperforms more sophisticated state-of-the-art
algorithms. We also provide a similar result in the convex setting with access
to noisy function values under mild noise assumptions. Finally, assuming only
access to stochastic gradients, we establish a lower bound that renders fully
parameter-free stochastic convex optimization infeasible, and provide a method
which is (partially) parameter-free up to the limit indicated by our lower
bound.
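
The abstract names the hyperparameter-search technique only at a high level. As a rough illustration of the generic idea, and not the paper's actual algorithm or its guarantees, the sketch below splits a fixed stochastic-gradient budget evenly across a logarithmic grid of SGD step sizes and keeps the candidate with the best observed value. The toy objective, the noisy gradient oracle, and the grid bounds are all hypothetical stand-ins.

```python
# Minimal sketch of the hyperparameter-search idea (assumptions throughout):
# run SGD once per step size from a logarithmic grid, each with a slice of
# the total budget, then return the candidate with the best observed value.
import numpy as np

rng = np.random.default_rng(0)

def stochastic_grad(x):
    # Hypothetical noisy gradient oracle for the toy objective f(x) = ||x||^2 / 2.
    return x + rng.normal(scale=0.1, size=x.shape)

def objective(x):
    # Hypothetical function value used to rank candidates (noiseless here for
    # simplicity; the paper's convex result works with noisy function values).
    return 0.5 * np.dot(x, x)

def sgd(x0, eta, steps):
    # Plain SGD with a fixed step size eta.
    x = x0.copy()
    for _ in range(steps):
        x = x - eta * stochastic_grad(x)
    return x

def grid_search_sgd(x0, total_budget, eta_min=1e-4, eta_max=1e1, grid_size=6):
    # Logarithmic grid of candidate step sizes; the gradient budget is split
    # evenly among them, so tuning costs only a multiplicative factor.
    grid = np.geomspace(eta_min, eta_max, num=grid_size)
    steps_each = total_budget // grid_size
    candidates = [sgd(x0, eta, steps_each) for eta in grid]
    # Select the run with the smallest (estimated) objective value.
    return min(candidates, key=objective)

x_best = grid_search_sgd(np.ones(10), total_budget=6000)
print(objective(x_best))
```

The design point the sketch tries to convey: since the grid is logarithmic, it covers an exponentially wide range of unknown step sizes at only logarithmic cost in the budget, which is what lets a simple search compete with an optimally tuned run without prior knowledge of the problem parameters.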