Rapid and Low-Cost Evaluation of Multi-fidelity Scheduling Algorithms for Hyperparameter Optimization.

Aakash Mishra, Jeremy Hsu, Frank D'Agostino, Utku Sirin, Stratos Idreos

Intelligent Data Engineering and Automated Learning – IDEAL 2023: 24th International Conference, Évora, Portugal, November 22–24, 2023, Proceedings (2023)

Abstract
Hyperparameter optimization (HPO), the process of searching for optimal hyperparameter configurations for a model, is becoming increasingly important with the rise of automated machine learning. HPO can be incredibly expensive as models grow in size. Multi-fidelity HPO schedulers attempt to use the resources given to each model configuration more efficiently by determining the order in which configurations are evaluated and the number of epochs they are run for. Pre-tabulated benchmarks are often used to reduce the compute power required to evaluate state-of-the-art schedulers; however, over-reliance on these benchmarks can lead to overfitting. To solve this problem, we introduce the Platform for Hyperparameter Optimization Search Simulation (PHOSS), which enables rapid HPO scheduler evaluation by dynamically generating surrogate benchmarks. PHOSS uses a computationally efficient and expressive exponential-decay parametric modeling approach to accurately generate surrogate benchmarks from real-world dataset samples without having to train an expensive surrogate model. We demonstrate that PHOSS can simulate several state-of-the-art schedulers on real-world benchmarks 4.5× faster while also dramatically reducing multi-GPU compute requirements, enabling full testing of HPO schedulers on a single commodity CPU.
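The exponential-decay parametric modeling the abstract describes can be illustrated with a minimal sketch. The assumption here is a three-parameter curve L(t) ≈ c + a·exp(−b·t) fitted to an observed loss-per-epoch trace; the function names (`fit_exp_decay`, `predict`) and the crude asymptote heuristic are hypothetical and are not taken from the PHOSS implementation:

```python
import math

def fit_exp_decay(epochs, losses):
    """Fit L(t) ~= c + a * exp(-b * t) to an observed learning curve.

    Hypothetical illustration of exponential-decay curve fitting, not
    PHOSS's actual method. Estimates the asymptote c crudely as slightly
    below the best observed loss (an assumption), then linearizes:
    log(L(t) - c) = log(a) - b * t, and solves by least squares.
    """
    c = min(losses) * 0.99          # crude asymptote estimate (assumption)
    ys = [math.log(l - c) for l in losses]
    n = len(epochs)
    mx = sum(epochs) / n
    my = sum(ys) / n
    # Slope of the log-linear fit is -b; intercept is log(a).
    slope = (sum((x - mx) * (y - my) for x, y in zip(epochs, ys))
             / sum((x - mx) ** 2 for x in epochs))
    b = -slope
    a = math.exp(my + b * mx)
    return a, b, c

def predict(t, a, b, c):
    """Query the fitted surrogate curve at epoch t."""
    return c + a * math.exp(-b * t)
```

A scheduler simulator could call `predict` instead of training the real model, which is what makes evaluating a multi-fidelity scheduler on a single CPU feasible: each "training run" collapses to evaluating a cheap closed-form curve.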
Keywords
hyperparameter optimization,scheduling,algorithms,low-cost,multi-fidelity