Hyperparameter Adaptive Search for Surrogate Optimization: A Self-Adjusting Approach

2023 Winter Simulation Conference (WSC), 2023

Abstract
Surrogate Optimization (SO) algorithms have shown promise for optimizing expensive black-box functions. However, their performance is heavily influenced by hyperparameters related to sampling and surrogate fitting, which poses a challenge to their widespread adoption. We investigate the impact of hyperparameters on various SO algorithms and propose a Hyperparameter Adaptive Search for SO (HASSO) approach. HASSO is not a hyperparameter tuning algorithm, but a generic self-adjusting SO algorithm that dynamically tunes its own hyperparameters while concurrently optimizing the primary objective function, without requiring additional evaluations. The aim is to improve the accessibility, effectiveness, and convergence speed of SO algorithms for practitioners. Our approach identifies and modifies the most influential hyperparameters specific to each problem and SO approach, reducing the need for manual tuning without significantly increasing the computational burden. Experimental results demonstrate the effectiveness of HASSO in enhancing the performance of various SO algorithms across different global optimization test problems.
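The paper does not give its algorithm here, but the core idea of a self-adjusting SO loop can be illustrated with a minimal sketch: a single sampling hyperparameter (the candidate spread `sigma`) is adapted using the same objective evaluations that drive the search, so no extra evaluations are spent on tuning. The function `hasso_sketch`, the 1-nearest-neighbor surrogate, and the multiplicative update factors are all hypothetical stand-ins, not the method described in the paper.

```python
import random

def hasso_sketch(f, bounds, n_iter=60, seed=0):
    """Illustrative self-adjusting surrogate-optimization loop (hypothetical,
    not the paper's HASSO): the sampling spread `sigma` tunes itself from the
    same expensive evaluations used to optimize f."""
    rng = random.Random(seed)
    lo, hi = bounds
    X = [rng.uniform(lo, hi) for _ in range(5)]  # initial design
    Y = [f(x) for x in X]
    sigma = 0.2 * (hi - lo)  # sampling-spread hyperparameter to be self-adjusted

    def surrogate(x):
        # 1-nearest-neighbor predictor as a cheap stand-in for a real surrogate
        return min(zip(X, Y), key=lambda p: abs(p[0] - x))[1]

    for _ in range(n_iter):
        xb = X[Y.index(min(Y))]  # current incumbent
        # candidate points sampled around the incumbent
        cands = [min(hi, max(lo, xb + rng.gauss(0, sigma))) for _ in range(20)]
        x = min(cands, key=surrogate)  # cheap acquisition step on the surrogate
        y = f(x)                       # the single expensive evaluation
        X.append(x)
        Y.append(y)
        # self-adjustment: widen the spread on improvement, shrink otherwise,
        # using only information already paid for by the evaluation above
        sigma *= 1.2 if y < min(Y[:-1]) else 0.8
        sigma = min(max(sigma, 1e-9), hi - lo)

    i = Y.index(min(Y))
    return X[i], Y[i]
```

For example, running this sketch on the quadratic `f(x) = (x - 2)**2` over `[-5, 5]` drives `sigma` down as failures accumulate near the optimum, mimicking the convergence behavior a self-adjusting SO algorithm aims for.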
Keywords
Alternating Optimization, Adaptive Search, Optimization Algorithm, Performance Of Algorithm, Convergence Rate, Global Problem, Global Optimization, Hyperparameter Tuning, Test Problems, Global Test, Manual Tuning, Black-box Function, Expensive Function, Discretion, Computation Time, Upper Bound, Alternative Models, Objective Value, Beta Distribution, Sampling Step, Candidate Points, Acquisition Function, Multivariate Adaptive Regression Splines, Upper Confidence Bound, Local Convergence, Bandit Problem, Hyperparameter Values, Black-box Optimization, Multi-armed Bandit, Gaussian Process Model