A Generalized Framework of Multifidelity Max-Value Entropy Search Through Joint Entropy

Neural Computation (2022)

Abstract
Bayesian optimization (BO) is a popular method for expensive black-box optimization problems; however, querying the objective function at every iteration can be a bottleneck that hinders efficient search. In this regard, multifidelity Bayesian optimization (MFBO) aims to accelerate BO by incorporating lower-fidelity observations that are available at a lower sampling cost. In our previous work, we proposed an information-theoretic approach to MFBO, referred to as multifidelity max-value entropy search (MF-MES), which inherits the practical effectiveness and computational simplicity of the well-known max-value entropy search (MES) for single-fidelity BO. However, the applicability of MF-MES is still limited to the case in which a single observation is obtained at each iteration. In this letter, we generalize MF-MES so that the information gain can be evaluated even when multiple observations are obtained simultaneously. This generalization enables MF-MES to address two practical problem settings: synchronous parallelization and trace-aware querying. We show that the acquisition functions for these extensions inherit the simplicity of MF-MES without introducing additional assumptions. We also provide computational techniques for entropy evaluation and posterior sampling in the acquisition functions, which can be used in common across all variants of MF-MES. The effectiveness of MF-MES is demonstrated using benchmark functions and real-world applications such as materials science data and hyperparameter tuning of machine-learning algorithms.
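For context, the single-fidelity MES acquisition that MF-MES builds on can be written in closed form once the GP posterior mean and standard deviation at a candidate point and Monte Carlo samples of the global maximum f* are available (Wang & Jegelka, 2017). Below is a minimal sketch of that single-fidelity acquisition; it is not the multifidelity or parallel variant proposed in this paper, and the inputs `mu`, `sigma`, and `f_star_samples` are assumed to be supplied by an external GP model and max-value sampler.

```python
import numpy as np
from scipy.stats import norm

def mes_acquisition(mu, sigma, f_star_samples):
    """Single-fidelity max-value entropy search (MES) acquisition.

    mu, sigma      : GP posterior mean and std at candidate points, shape (n,)
    f_star_samples : Monte Carlo samples of the global maximum f*, shape (k,)

    Returns the approximate information gain I(f*; y(x)) for each candidate.
    """
    mu = np.asarray(mu)[:, None]                  # (n, 1)
    sigma = np.asarray(sigma)[:, None]            # (n, 1)
    f_star = np.asarray(f_star_samples)[None, :]  # (1, k)

    gamma = (f_star - mu) / sigma                 # standardized gap to each sampled max
    cdf = np.clip(norm.cdf(gamma), 1e-12, None)   # guard against log(0)
    pdf = norm.pdf(gamma)

    # Closed-form entropy reduction of a Gaussian truncated at f*, averaged over f* samples
    info_gain = gamma * pdf / (2.0 * cdf) - np.log(cdf)
    return info_gain.mean(axis=1)

# Illustrative usage: pick the candidate with the largest expected information gain.
# mu, sigma would come from a fitted GP; f_star_samples from posterior max-value sampling.
# next_index = np.argmax(mes_acquisition(mu, sigma, f_star_samples))
```

The generalization described in the abstract replaces the single observation y(x) with a joint (multifidelity, possibly batched) set of observations and evaluates the corresponding joint entropy, which is where the paper's computational techniques for entropy evaluation and posterior sampling come in.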
Keywords
search, max-value