PWDformer: Deformable transformer for long-term series forecasting

Pattern Recognition (2024)

Abstract
Long-term forecasting is of paramount importance in numerous scenarios, including predicting future energy, water, and food consumption; extreme weather events and natural disasters, for instance, can profoundly impact infrastructure operations and pose severe safety concerns. Traditional CNN-based models often struggle to capture long-distance dependencies effectively. In contrast, Transformer-based models have shown significant promise in long-term forecasting. This paper investigates the long-term forecasting problem and identifies a common limitation of existing Transformer-based models: they tend to reduce computational complexity at the expense of temporal information aggregation capability. Moreover, although the order of a time series plays a crucial role in accurate prediction, current Transformer-based models are insensitive to it, which undermines the soundness of their predictions. To address these issues, we propose a novel Deformable-Local (DL) aggregation mechanism. This mechanism enhances the model's ability to aggregate temporal information and allows it to adaptively adjust the size of the aggregation window, so the model can discern more complex temporal patterns and produce more accurate predictions. Additionally, our model incorporates a Frequency Selection module to reinforce effective features and suppress noise, and we introduce Position Weights to mitigate the order-insensitivity of existing methods. In extensive evaluations of long-term forecasting tasks, we conducted benchmark tests on six datasets covering practical applications in energy, traffic, economics, weather, and disease. Our method achieved state-of-the-art (SOTA) results with significant improvements. For instance, on the ETT dataset, our model achieved an average MSE improvement of approximately 19% and an average MAE improvement of around 27%; for prediction lengths of 96 and 192, we achieved MSE and MAE improvements of 32.1% and 30.9%, respectively.
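The abstract does not spell out how the DL aggregation is implemented. As a rough illustration of the core idea of an adaptively sized ("deformable") local aggregation window, here is a minimal PyTorch-style sketch. All names (DeformableLocalAggregation, window_head, max_window) are hypothetical and not taken from the paper, and the soft-masking scheme below is one plausible realization, not the authors' method.

```python
import torch
import torch.nn as nn

class DeformableLocalAggregation(nn.Module):
    """Hypothetical sketch: each time step predicts its own aggregation
    window size, then averages neighbouring steps within that window."""

    def __init__(self, d_model: int, max_window: int = 8):
        super().__init__()
        self.max_window = max_window
        # Predicts a per-step window size in [1, max_window] (assumed head).
        self.window_head = nn.Linear(d_model, 1)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, length, d_model)
        B, L, D = x.shape
        # Soft window size per position, squashed into [1, max_window].
        w = 1 + (self.max_window - 1) * torch.sigmoid(self.window_head(x))  # (B, L, 1)
        # Relative offsets of candidate neighbours around each position.
        offsets = torch.arange(-self.max_window, self.max_window + 1, device=x.device)
        idx = torch.arange(L, device=x.device).unsqueeze(1) + offsets       # (L, 2W+1)
        idx = idx.clamp(0, L - 1)                                           # clip at borders
        neighbours = x[:, idx, :]                                           # (B, L, 2W+1, D)
        # Soft mask: a neighbour contributes while |offset| <= window size,
        # so gradients can still adjust the predicted window.
        mask = torch.sigmoid(w.unsqueeze(-1) - offsets.abs().view(1, 1, -1, 1).float())
        # Masked average over each position's adaptive neighbourhood.
        return (neighbours * mask).sum(dim=2) / mask.sum(dim=2).clamp(min=1e-6)

# Usage: aggregate a batch of 4 series of length 96 with model width 64.
x = torch.randn(4, 96, 64)
agg = DeformableLocalAggregation(d_model=64, max_window=8)
y = agg(x)  # (4, 96, 64)
```

The sigmoid mask makes the window boundary differentiable, so the window size itself can be learned end-to-end; a hard cutoff would block gradients to the window predictor.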
Keywords
Long-term forecasting, Time series forecasting, Deep learning, Transformer