Water Quality Prediction Based on the KF-LSTM Encoder-Decoder Network: A Case Study with Missing Data Collection

Water (2023)

Abstract
This paper focuses on water quality prediction in the presence of a large number of missing values in water quality monitoring data. Current water quality monitoring data mostly come from different monitoring stations in different water bodies. As the duration of water quality monitoring increases, so does the complexity of the data, and missing values are a common and difficult-to-avoid problem in water quality monitoring. To fully exploit the valuable features of the monitored data and improve the accuracy of water quality prediction models, we propose a long short-term memory (LSTM) encoder-decoder model that combines a Kalman filter (KF) with an attention mechanism. The Kalman filter quickly reconstructs and pre-processes the hydrological data, while the attention mechanism, added between the encoder and the decoder, addresses the loss of long-range information in traditional recurrent neural network models and fully exploits the interaction information among high-dimensional covariate data. Using original data from the Haimen Bay water quality monitoring station in the Lianjiang River Basin, we trained and tested the model on measurements from 1 January 2019 to 30 June 2020 to predict future water quality. The results show that, compared with the traditional LSTM model, the KF-LSTM model reduces the mean absolute error (MAE) by 10%, the mean square error (MSE) by 21.2%, and the root mean square error (RMSE) by 13.2%, while increasing the coefficient of determination (R²) by 4.5%. The model is better suited to situations where water quality data contain many missing values and offers a new solution for real-time management of urban aquatic environments.
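The abstract describes a two-stage pipeline: Kalman-filter imputation of missing sensor readings, then an LSTM encoder-decoder with attention between the encoder and decoder. The sketch below is not the authors' code; it is a minimal illustration of that pipeline, assuming a simple 1-D random-walk state model for the Kalman filter, Luong-style dot-product attention, and illustrative layer sizes and variable names.

```python
# Minimal sketch (assumed details): a 1-D Kalman filter fills missing readings,
# then an LSTM encoder-decoder with dot-product attention forecasts future values.
import numpy as np
import torch
import torch.nn as nn


def kalman_impute(series, q=1e-3, r=1e-2):
    """Fill NaNs in a 1-D series using an assumed random-walk Kalman filter."""
    x, p = 0.0, 1.0                      # state estimate and its variance
    out = np.empty_like(series, dtype=float)
    for t, z in enumerate(series):
        p += q                           # predict: state unchanged, uncertainty grows
        if not np.isnan(z):              # update only when a measurement exists
            k = p / (p + r)
            x += k * (z - x)
            p *= (1 - k)
        out[t] = x                       # missing steps keep the predicted state
    return out


class Encoder(nn.Module):
    def __init__(self, n_features, hidden):
        super().__init__()
        self.lstm = nn.LSTM(n_features, hidden, batch_first=True)

    def forward(self, x):                # x: (batch, src_len, n_features)
        return self.lstm(x)              # encoder outputs and final (h, c) state


class AttnDecoder(nn.Module):
    def __init__(self, hidden, horizon):
        super().__init__()
        self.horizon = horizon
        self.lstm = nn.LSTM(1, hidden, batch_first=True)
        self.out = nn.Linear(2 * hidden, 1)

    def forward(self, enc_out, state, y0):
        preds, y = [], y0                # y0: last observed value, shape (batch, 1, 1)
        for _ in range(self.horizon):
            dec_out, state = self.lstm(y, state)
            # Dot-product attention over all encoder time steps
            scores = torch.bmm(dec_out, enc_out.transpose(1, 2))   # (batch, 1, src_len)
            weights = torch.softmax(scores, dim=-1)
            context = torch.bmm(weights, enc_out)                  # (batch, 1, hidden)
            y = self.out(torch.cat([dec_out, context], dim=-1))    # next prediction
            preds.append(y)
        return torch.cat(preds, dim=1)   # (batch, horizon, 1)


# Toy usage: impute a short series with gaps, then run one forward pass.
raw = np.array([7.1, np.nan, 7.3, np.nan, np.nan, 7.0, 6.9, np.nan])
filled = kalman_impute(raw)
x = torch.tensor(filled, dtype=torch.float32).view(1, -1, 1)
enc, dec = Encoder(1, 32), AttnDecoder(32, horizon=3)
enc_out, state = enc(x)
forecast = dec(enc_out, state, x[:, -1:, :])
print(forecast.shape)                    # torch.Size([1, 3, 1])
```

In this sketch the decoder feeds each prediction back as the next input and re-attends to all encoder outputs at every step, which is one common way to keep long-range information available to the decoder; the paper's exact attention formulation and KF state model may differ.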
Keywords
Kalman filter, LSTM, encoder-decoder, time series prediction, water quality prediction, attention mechanism