An Uncertainty-Aware Measure of Model Calibration Flexibility

Conference Proceedings of the Society for Experimental Mechanics (2023)

Abstract
Physics-based models of structural dynamic systems are needed for various engineering applications, including structural control and condition monitoring. These models often need to be calibrated against experimental measurements to mitigate uncertainties in poorly known model parameters and to account for systematic model errors. In such a calibration campaign, under- or overfitting of a model to measured data may prevent generalizable predictions. An underfitted calibration campaign fails to fully capture the underlying patterns, misses opportunities to learn from the measured data, and leads to inferior predictive capability; an overfitted calibration campaign may yield a satisfactory goodness-of-fit but degrades generalizability and, in turn, the usefulness of the calibrated model. There is a well-known trade-off between goodness-of-fit to measured data and generalizability in unmeasured settings. In this context, the generalizability of a model calibration campaign denotes the ability of the model to fit alternative datasets. For a given set of available experiments, determining the optimal flexibility of a model calibration campaign is necessary to achieve the maximum possible generalizability. This work presents a generally applicable metric that quantifies the flexibility of model calibration while effectively taking these factors into account. We present the computational framework for the metric and demonstrate its application on a polynomial problem.
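
To make the trade-off described above concrete, the short Python sketch below calibrates polynomials of increasing degree to noisy synthetic "measurements" and compares goodness-of-fit on the calibration data against error on a held-out set. It is an illustration only, not the paper's flexibility metric: the data-generating cubic, noise level, degree range, and RMSE comparison are all hypothetical choices made for demonstration.

    # Illustrative sketch only: the paper's uncertainty-aware flexibility metric
    # is not reproduced here. This shows the goodness-of-fit vs. generalizability
    # trade-off on a hypothetical polynomial calibration problem.
    import numpy as np

    rng = np.random.default_rng(0)

    # Hypothetical "measured" data: a cubic trend with noise (stand-in for experiments).
    x_train = np.linspace(-1.0, 1.0, 15)
    y_train = 0.5 * x_train**3 - x_train + rng.normal(scale=0.1, size=x_train.size)

    # Held-out data from the same underlying system (the "unmeasured setting").
    x_test = np.linspace(-1.0, 1.0, 50)
    y_test = 0.5 * x_test**3 - x_test + rng.normal(scale=0.1, size=x_test.size)

    def rmse(y, y_hat):
        return float(np.sqrt(np.mean((y - y_hat) ** 2)))

    # Sweep calibration flexibility (here: polynomial degree) and compare
    # goodness-of-fit on the measured data with error on the held-out data.
    for degree in range(1, 11):
        coeffs = np.polyfit(x_train, y_train, degree)   # calibrate the model
        fit_err = rmse(y_train, np.polyval(coeffs, x_train))
        gen_err = rmse(y_test, np.polyval(coeffs, x_test))
        print(f"degree={degree:2d}  fit RMSE={fit_err:.3f}  held-out RMSE={gen_err:.3f}")

    # Low degrees underfit (both errors stay high); high degrees overfit (fit RMSE
    # keeps shrinking while held-out RMSE grows). The optimal flexibility lies in between.

Running the sweep shows the pattern the abstract argues for: training error decreases monotonically with flexibility, while held-out error reaches a minimum at an intermediate degree and then deteriorates.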
Keywords
model calibration flexibility, uncertainty-aware