Explainable machine learning model for multi-step forecasting of reservoir inflow with uncertainty quantification

Environmental Modelling & Software (2023)

Abstract
We propose an explainable machine learning (ML) model with uncertainty quantification (UQ) to improve multi-step reservoir inflow forecasting. Traditional ML methods struggle to forecast inflows multiple days ahead and typically lack explainability and UQ. To address these limitations, we introduce an encoder–decoder long short-term memory (ED-LSTM) network for multi-step forecasting, employ the SHapley Additive exPlanation (SHAP) technique to understand the influence of hydrometeorological factors on inflow prediction, and develop a novel UQ method for prediction trustworthiness. We apply these methods to forecast 7-day inflow in snow-dominated and rain-driven reservoirs. The results demonstrate the effectiveness of the ED-LSTM model, with high forecasting accuracy at short lead times. Our UQ method provides reliable uncertainty estimates, covering 90% of observations at the 90% confidence level. The SHAP analysis identifies historical inflow and precipitation as the most influential factors. These findings and methods may support reservoir operators in optimizing water resources management decisions.
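To illustrate the encoder–decoder LSTM forecasting setup described in the abstract, the sketch below builds a minimal ED-LSTM in Python/Keras that maps a window of past hydrometeorological inputs to a 7-step inflow forecast. The lookback length, layer sizes, and feature set are illustrative assumptions, not the authors' configuration; the SHAP analysis and UQ method are not reproduced here.

```python
# Minimal ED-LSTM sketch for 7-day-ahead inflow forecasting.
# Shapes, layer sizes, and features are assumptions for illustration only.
import numpy as np
from tensorflow.keras import layers, Model

LOOKBACK = 30    # days of past inputs (assumed)
HORIZON = 7      # forecast lead times, as in the paper
N_FEATURES = 4   # e.g. inflow, precipitation, temperature, snow (assumed)

# Encoder: compress the input window into a fixed-length state.
enc_in = layers.Input(shape=(LOOKBACK, N_FEATURES))
_, state_h, state_c = layers.LSTM(64, return_state=True)(enc_in)

# Decoder: unroll the encoder state over the 7-step horizon.
dec_in = layers.RepeatVector(HORIZON)(state_h)
dec_out = layers.LSTM(64, return_sequences=True)(
    dec_in, initial_state=[state_h, state_c]
)
out = layers.TimeDistributed(layers.Dense(1))(dec_out)  # one inflow per lead time

model = Model(enc_in, out)
model.compile(optimizer="adam", loss="mse")

# Synthetic data, just to show the expected tensor shapes.
X = np.random.rand(256, LOOKBACK, N_FEATURES).astype("float32")
y = np.random.rand(256, HORIZON, 1).astype("float32")
model.fit(X, y, epochs=2, batch_size=32, verbose=0)
print(model.predict(X[:1]).shape)  # (1, 7, 1): a 7-day inflow forecast
```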
Keywords
Multi-step forecasting, Uncertainty quantification, Explainable machine learning, Encoder–decoder LSTM, SHAP, Reservoir inflow