Self-Supervised Contrastive Forecasting
CoRR (2024)
Abstract
Long-term forecasting presents unique challenges due to the time and memory
complexity of handling long sequences. Existing methods, which rely on sliding
windows to process long sequences, struggle to effectively capture long-term
variations that are only partially captured within the short window (i.e.,
outer-window variations). In this paper, we introduce a novel approach that
overcomes this limitation by employing contrastive learning and an enhanced
decomposition architecture, specifically designed to focus on long-term
variations. To this end, our contrastive loss incorporates global
autocorrelation held in the whole time series, which facilitates the
construction of positive and negative pairs in a self-supervised manner. When
combined with our decomposition networks, our contrastive learning
significantly improves long-term forecasting performance. Extensive experiments
demonstrate that our approach outperforms 14 baseline models in multiple
experiments over nine long-term benchmarks, especially in challenging scenarios
that require a significantly long output for forecasting. Source code is
available at
https://github.com/junwoopark92/Self-Supervised-Contrastive-Forecsating.
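The abstract describes building positive and negative pairs from the global autocorrelation of the whole series. A minimal sketch of that idea follows, with two caveats: the FFT-based autocorrelation and the threshold-based pairing rule are illustrative assumptions, not the paper's exact loss, and the function names (`global_autocorrelation`, `pair_windows`) are hypothetical.

```python
import numpy as np

def global_autocorrelation(series: np.ndarray) -> np.ndarray:
    """Autocorrelation of the entire series at every lag, computed via FFT."""
    x = series - series.mean()
    n = len(x)
    # Zero-pad to 2n so the circular convolution matches linear autocorrelation.
    f = np.fft.rfft(x, n=2 * n)
    acf = np.fft.irfft(f * np.conj(f))[:n]
    return acf / acf[0]  # normalize so lag 0 equals 1

def pair_windows(starts, acf, threshold=0.5):
    """Label each window pair positive (True) or negative (False) using the
    global autocorrelation at the lag separating the windows' start indices."""
    pairs = {}
    for i, si in enumerate(starts):
        for j, sj in enumerate(starts):
            if i < j:
                lag = abs(sj - si)
                pairs[(i, j)] = bool(acf[lag] >= threshold)
    return pairs
```

For a periodic series, two windows one full period apart get a high autocorrelation at their separating lag and so would be treated as a positive pair, while windows half a period apart would be negatives; a contrastive loss can then pull the former's representations together and push the latter's apart.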
Keywords
Contrastive learning, time-series forecasting, long-term representation