Unified Training of Universal Time Series Forecasting Transformers
CoRR (2024)
Abstract
Deep learning for time series forecasting has traditionally operated within a
one-model-per-dataset framework, limiting its potential to leverage the
game-changing impact of large pre-trained models. The concept of universal
forecasting, emerging from pre-training on a vast collection of time series
datasets, envisions a single Large Time Series Model capable of addressing
diverse downstream forecasting tasks. However, constructing such a model poses
unique challenges specific to time series data: i) cross-frequency learning,
ii) accommodating an arbitrary number of variates for multivariate time series,
and iii) addressing the varying distributional properties inherent in
large-scale data. To address these challenges, we present novel enhancements to
the conventional time series Transformer architecture, resulting in our
proposed Masked Encoder-based Universal Time Series Forecasting Transformer
(Moirai). Trained on our newly introduced Large-scale Open Time Series Archive
(LOTSA) featuring over 27B observations across nine domains, Moirai achieves
competitive or superior performance as a zero-shot forecaster when compared to
full-shot models. Code, model weights, and data will be released.