A Non-Nested Infilling Strategy For Multifidelity Based Efficient Global Optimization

INTERNATIONAL JOURNAL FOR UNCERTAINTY QUANTIFICATION (2021)

Abstract
Efficient global optimization (EGO) has become a standard approach for the global optimization of complex systems with high computational costs. EGO uses a training set of objective-function values, computed at selected input points, to construct a statistical surrogate model with low evaluation cost, to which the optimization procedure is applied. The training set is sequentially enriched by selecting new points according to a prescribed infilling strategy, in order to converge to the optimum of the original costly model. Multifidelity approaches, which combine evaluations of the quantity of interest at different fidelity levels, have recently been introduced to reduce the computational cost of building a global surrogate model. However, the use of multifidelity approaches in the context of EGO remains an open research topic. In this work, we propose a new, effective infilling strategy for multifidelity EGO. Our infilling strategy has the particularity of relying on non-nested training sets, a characteristic that comes with several computational benefits. When enriching the multifidelity training set, the strategy selects the next input point together with the fidelity level at which the objective function is evaluated. This contrasts with previous nested approaches, which require evaluation of all lower fidelity levels and are more demanding when updating the surrogate. The resulting EGO procedure achieves a significantly reduced computational cost, avoiding computations at unnecessary fidelity levels whenever possible, while also being more robust to low correlations between levels and to noisy estimations. Analytical problems are used to test and illustrate the efficiency of the method. It is finally applied to the optimization of a fully nonlinear fluid-structure interaction system to demonstrate its feasibility on real large-scale problems, with fidelity levels mixing physical approximations in the constitutive models and discretization refinements.
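To make the EGO loop described above concrete, the following is a minimal, hedged sketch of a single-fidelity EGO iteration using a Gaussian process surrogate and the classical expected-improvement infilling criterion. It is not the authors' non-nested multifidelity criterion (which additionally selects the fidelity level of each new evaluation); the function names, the scikit-learn GP surrogate, the random candidate pool, and all parameters are illustrative assumptions.

```python
import numpy as np
from scipy.stats import norm
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import Matern


def expected_improvement(x_cand, gp, y_best):
    """Expected improvement (minimization convention) of candidate points under the GP surrogate."""
    mu, sigma = gp.predict(x_cand, return_std=True)
    sigma = np.maximum(sigma, 1e-12)          # guard against zero predictive variance
    z = (y_best - mu) / sigma
    return (y_best - mu) * norm.cdf(z) + sigma * norm.pdf(z)


def ego_minimize(objective, bounds, n_init=5, n_iter=20, n_cand=2000, rng=None):
    """Illustrative single-fidelity EGO loop: fit GP, evaluate the EI-maximizing candidate, repeat."""
    rng = np.random.default_rng(rng)
    lo, hi = np.array(bounds, dtype=float).T
    dim = len(bounds)

    # Initial training set of costly objective evaluations
    X = rng.uniform(lo, hi, size=(n_init, dim))
    y = np.array([objective(x) for x in X])

    for _ in range(n_iter):
        gp = GaussianProcessRegressor(kernel=Matern(nu=2.5), normalize_y=True)
        gp.fit(X, y)

        # Infilling step: pick the candidate with maximal expected improvement
        cand = rng.uniform(lo, hi, size=(n_cand, dim))
        ei = expected_improvement(cand, gp, y.min())
        x_new = cand[np.argmax(ei)]

        # Enrich the training set with the new evaluation
        X = np.vstack([X, x_new])
        y = np.append(y, objective(x_new))

    return X[np.argmin(y)], y.min()


if __name__ == "__main__":
    # Usage: minimize a simple quadratic on [-2, 2]^2
    f = lambda x: float(np.sum((x - 0.5) ** 2))
    x_best, f_best = ego_minimize(f, bounds=[(-2, 2), (-2, 2)], rng=0)
    print(x_best, f_best)
```

In a multifidelity variant such as the one proposed in the paper, the infilling step would return both a point and a fidelity level, and the surrogate would be trained on the (possibly non-nested) evaluations gathered at every level.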
Keywords
global optimization, Gaussian process model, multifidelity, non-nested datasets