Issues in calibrating models with multiple unbalanced constraints: the significance of systematic model and data errors

METHODS IN ECOLOGY AND EVOLUTION (2022)

Abstract
Calibrating process-based models using multiple constraints often improves the identifiability of model parameters, helps to avoid errors compensating for each other, and produces model predictions that are more consistent with the underlying processes. However, using multiple constraints can cause predictions for some variables to get worse. This is particularly common when combining data sources with very different sample sizes. Such unbalanced model-data fusion efforts are becoming increasingly common, for example when combining manual and automated measurements. Here we use a series of simulated virtual data experiments to demonstrate and disentangle the underlying causes of issues that can occur when calibrating models with multiple unbalanced constraints in combination with systematic errors in models and data. We propose a diagnostic tool to help identify whether a calibration is failing due to these factors. We also test the utility of adding terms representing uncertainty in systematic model and data errors to the calibration. We show that unbalanced data by itself is not the problem: when fitting simulated data to the 'true' model, we can correctly recover the model parameters and the true dynamics of latent variables. However, when there are systematic errors in the model or the data, we cannot recover the correct parameters. Consequently, the modelled dynamics of the low-data-volume variables depart significantly from the true values. We demonstrate the utility of the diagnostic tool and show that it can also be used to identify the extent of imbalance before the calibration starts to ignore the sparser data. Finally, we show that representing uncertainty in model structural errors and data biases in the calibration can greatly improve the model fit to low-volume data and improve the coverage of uncertainty estimates.
We conclude that the underlying issue is not one of sample size or information content per se, despite the popularity of ad hoc approaches that focus on 'weighting' datasets to achieve balance. Our results emphasize the importance of considering model structural deficiencies and data systematic biases in the calibration of process-based models.
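The key finding can be illustrated with a minimal, hypothetical sketch (all names, data and parameter values here are invented for illustration and are not from the paper): two constraints with very different sample sizes, where the high-volume constraint carries a systematic additive bias. Calibrating a shared parameter without accounting for that bias lets the biased, high-volume data pull the estimate away from the truth; adding a single bias parameter to the calibration recovers it, without any ad hoc reweighting of the datasets.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)

# "True" process: y = a * x, with a_true = 2.0 (illustrative toy model).
a_true = 2.0
bias_true = 1.5  # systematic additive error in the high-volume data stream

# Constraint 1: automated, high-volume data (n = 1000) with a systematic bias.
x1 = rng.uniform(0, 10, 1000)
y1 = a_true * x1 + bias_true + rng.normal(0, 0.5, x1.size)

# Constraint 2: manual, low-volume, unbiased data (n = 10).
x2 = rng.uniform(0, 10, 10)
y2 = a_true * x2 + rng.normal(0, 0.5, x2.size)

def sse_naive(theta):
    # Joint sum of squared residuals, ignoring the possibility of bias:
    # the high-volume constraint dominates and drags the slope estimate.
    (a,) = theta
    return np.sum((y1 - a * x1) ** 2) + np.sum((y2 - a * x2) ** 2)

def sse_bias(theta):
    # Same joint fit, but a bias parameter b absorbs the systematic
    # error in the high-volume data stream.
    a, b = theta
    return np.sum((y1 - (a * x1 + b)) ** 2) + np.sum((y2 - a * x2) ** 2)

a_naive = minimize(sse_naive, [1.0]).x[0]
a_bias, b_hat = minimize(sse_bias, [1.0, 0.0]).x

print(f"naive slope:      {a_naive:.3f}  (true: {a_true})")
print(f"with bias term:   {a_bias:.3f}  (true: {a_true})")
print(f"estimated bias:   {b_hat:.3f}  (true: {bias_true})")
```

In this toy setting the naive joint fit is pulled noticeably off the true slope by the 100:1 data imbalance combined with the bias, while the calibration that represents the systematic error recovers both the slope and the bias, echoing the paper's point that the imbalance itself is not the root cause.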
Keywords
Bayesian inference, inverse modelling, model calibration, model discrepancy, multiple constraints, predictive uncertainty, structural model error, systematic data bias