BiLSTM-MLAM: A Multi-Scale Time Series Prediction Model Based on Bi-LSTM and Local Attention Mechanism

Yongxin Fan, Qian Tang, Yangming Guo

Crossref (2024)

Abstract

This paper introduces BiLSTM-MLAM, a novel multi-scale time series prediction model. First, the approach uses a Bidirectional Long Short-Term Memory network to capture information from both the forward and backward directions of the time series. Next, a multi-scale patch segmentation module generates multiple long sequences composed of equal-length segments, enabling the model to capture data patterns across different time scales by adjusting the segment length. Finally, a local attention mechanism enhances feature extraction by identifying and weighting the most informative time segments, strengthening the model's understanding of the local features of the series, after which the multi-scale features are fused. By effectively capturing sequence information across multiple time scales, the model delivers strong performance on time series prediction tasks. Experimental validation shows that BiLSTM-MLAM outperforms six baseline methods across multiple datasets. In predicting the remaining life of aircraft engines, BiLSTM-MLAM improves on the best baseline by 6.66% in RMSE and 11.50% in MAE; on the LTE dataset, it achieves improvements of 12.77% in RMSE and 3.06% in MAE; and on the load dataset, it achieves improvements of 17.96% in RMSE and 30.39% in MAE. Ablation experiments further confirm the positive contribution of each module to prediction accuracy. Segment-length tuning experiments show that combining different segment lengths yields lower prediction errors, confirming the effectiveness of the multi-scale fusion strategy in improving prediction accuracy by integrating information from multiple time scales.
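To make the pipeline described in the abstract concrete, the following is a minimal PyTorch sketch of the three stages: Bi-LSTM encoding, multi-scale segmentation into equal-length patches, and local attention over segments followed by feature fusion. All module names, dimensions, segment lengths, and the exact attention and fusion formulations here are assumptions for illustration only; the paper's actual implementation may differ.

```python
import torch
import torch.nn as nn


class LocalAttention(nn.Module):
    """Scores each equal-length segment and returns their weighted sum."""

    def __init__(self, dim: int):
        super().__init__()
        self.score = nn.Linear(dim, 1)

    def forward(self, segments: torch.Tensor) -> torch.Tensor:
        # segments: (batch, num_segments, dim)
        weights = torch.softmax(self.score(segments), dim=1)  # segment importance
        return (weights * segments).sum(dim=1)                # (batch, dim)


class BiLSTMMLAM(nn.Module):
    """Illustrative sketch of the BiLSTM-MLAM pipeline (not the authors' code)."""

    def __init__(self, input_size: int, hidden_size: int = 64,
                 segment_lengths=(4, 8, 16)):
        super().__init__()
        self.segment_lengths = segment_lengths
        self.bilstm = nn.LSTM(input_size, hidden_size,
                              batch_first=True, bidirectional=True)
        dim = 2 * hidden_size
        self.attn = nn.ModuleList(LocalAttention(dim) for _ in segment_lengths)
        self.head = nn.Linear(dim * len(segment_lengths), 1)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, seq_len, input_size)
        h, _ = self.bilstm(x)  # forward + backward states: (batch, seq_len, 2*hidden)
        scale_features = []
        for length, attn in zip(self.segment_lengths, self.attn):
            b, t, d = h.shape
            t_trim = (t // length) * length                    # drop the ragged tail
            patches = h[:, :t_trim].reshape(b, -1, length, d)  # equal-length segments
            segments = patches.mean(dim=2)                     # one vector per segment
            scale_features.append(attn(segments))              # local attention per scale
        fused = torch.cat(scale_features, dim=-1)              # multi-scale feature fusion
        return self.head(fused).squeeze(-1)                    # point forecast


if __name__ == "__main__":
    model = BiLSTMMLAM(input_size=8)
    x = torch.randn(32, 96, 8)   # e.g. 96 past time steps, 8 sensor channels
    print(model(x).shape)        # torch.Size([32])
```

In this sketch, varying `segment_lengths` plays the role of the segment-length tuning described in the abstract: each length produces one scale-specific feature vector, and concatenating them implements a simple form of multi-scale fusion.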