Hyper-differential sensitivity analysis with respect to model discrepancy: Optimal solution updating

arXiv (2023)

Abstract
A common goal throughout science and engineering is to solve optimization problems constrained by computational models. However, in many cases a high-fidelity numerical model of the system cannot be optimized directly because code complexity and computational cost prohibit the use of intrusive, many-query algorithms. Rather, lower-fidelity models are constructed to enable intrusive algorithms for large-scale optimization. As a result of the discrepancy between the high- and low-fidelity models, optimal solutions determined using low-fidelity models are frequently far from true optimality. In this article we introduce a novel approach that uses post-optimality sensitivities with respect to model discrepancy to update the optimization solution. Limited high-fidelity data are used to calibrate the model discrepancy in a Bayesian framework, which in turn is propagated through post-optimality sensitivities of the low-fidelity optimization problem. Our formulation exploits structure in the post-optimality sensitivity operator to achieve computational scalability. Numerical results demonstrate how an optimal solution computed using a low-fidelity model may be significantly improved with limited evaluations of a high-fidelity model.
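
To make the updating step concrete, the following is a minimal toy sketch of the idea described in the abstract, under assumptions of our own: a quadratic low-fidelity objective and a discrepancy entering linearly through parameters theta. The symbols A, b, C and all code are illustrative and do not come from the paper; the paper's actual formulation targets PDE-constrained problems.

```python
# Toy sketch of optimal-solution updating via post-optimality
# sensitivities with respect to model discrepancy (our construction,
# not the paper's code).
import numpy as np

rng = np.random.default_rng(0)

# Assumed low-fidelity objective:
#   J(z, theta) = 0.5 z^T A z - b^T z + theta^T C z,
# where theta parameterizes a (here, linear) model discrepancy term.
n, m = 5, 3                         # dims: optimization variable / discrepancy
A = np.diag(np.arange(1.0, n + 1))  # SPD Hessian of the low-fidelity problem
b = np.ones(n)
C = rng.standard_normal((m, n))     # mixed derivative d^2 J / (d theta d z)

# Nominal optimum at theta = 0: solve grad_z J = A z - b = 0.
z_nominal = np.linalg.solve(A, b)

# Post-optimality sensitivity via the implicit function theorem applied
# to the optimality condition grad_z J(z*(theta), theta) = 0:
#   dz*/dtheta = -(d^2 J / dz^2)^{-1} (d^2 J / (d theta d z))^T
sensitivity = -np.linalg.solve(A, C.T)          # shape (n, m)

# Bayesian calibration would yield a posterior on theta from limited
# high-fidelity data; Gaussian samples stand in for it here.
theta_samples = 0.1 * rng.standard_normal((200, m))

# First-order update of the optimum for each posterior sample:
#   z*(theta) ~= z*(0) + (dz*/dtheta) theta
z_updated = z_nominal + theta_samples @ sensitivity.T

print("nominal optimum:         ", z_nominal)
print("mean of updated optima:  ", z_updated.mean(axis=0))
```

In the large-scale setting the abstract targets, the Hessian solve would presumably be performed matrix-free rather than with a dense factorization; the dense linear algebra above is purely for illustration on a small problem.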
Keywords
Hyper-differential sensitivity analysis, Post-optimality sensitivity analysis, PDE-constrained optimization, Model discrepancy