Lag-Llama: Towards Foundation Models for Probabilistic Time Series Forecasting

Kashif Rasul, Arjun Ashok, Andrew Robert Williams, Hena Ghonia, Rishika Bhagwatkar, Arian Khorasani, Mohammad Javad Darvishi Bayazi, George Adamopoulos, Roland Riachi, Nadhir Hassen, Marin Biloš, Sahil Garg, Anderson Schneider, Nicolas Chapados, Alexandre Drouin, Valentina Zantedeschi, Yuriy Nevmyvaka, Irina Rish

CoRR (2023)

Abstract
In recent years, foundation models have caused a paradigm shift in machine learning due to their unprecedented capabilities for zero-shot and few-shot generalization. However, despite this success in modalities such as natural language processing and computer vision, the development of foundation models for time series forecasting has lagged behind. We present Lag-Llama, a general-purpose foundation model for univariate probabilistic time series forecasting based on a decoder-only transformer architecture that uses lags as covariates. Lag-Llama is pretrained on a large corpus of diverse time series data from several domains, and demonstrates strong zero-shot generalization compared to a wide range of forecasting models on downstream datasets across domains. Moreover, when fine-tuned on relatively small fractions of such previously unseen datasets, Lag-Llama achieves state-of-the-art performance, outperforming prior deep learning approaches and emerging as the best general-purpose model on average. Lag-Llama serves as a strong contender to the current state of the art in time series forecasting and paves the way for future advances in foundation models tailored to time series data.
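The core architectural idea, feeding past values at fixed lag offsets into the model as covariates for each time step, can be illustrated with a short sketch. The following is a minimal, hypothetical NumPy example, not the paper's implementation: the helper make_lag_features and the lag set [1, 7] are purely illustrative (Lag-Llama's actual lag indices follow richer, frequency-dependent conventions).

    import numpy as np

    def make_lag_features(series, lags):
        # Stack lagged copies of a 1-D series as covariates.
        # Row t holds [y_t, y_{t-lags[0]}, y_{t-lags[1]}, ...]
        # for every t at which all requested lags are available.
        # Illustrative helper only; not the paper's actual featurizer.
        max_lag = max(lags)
        target = series[max_lag:]
        lagged = [series[max_lag - lag : len(series) - lag] for lag in lags]
        return np.stack([target] + lagged, axis=1)

    # Toy usage on a ramp series with two illustrative lags.
    y = np.arange(20, dtype=float)
    features = make_lag_features(y, lags=[1, 7])
    print(features[:3])
    # [[7. 6. 0.]
    #  [8. 7. 1.]
    #  [9. 8. 2.]]

In the model, each such lag vector forms part of the input token for one time step; the decoder-only transformer then predicts the parameters of a forecast distribution over the next value, which is what makes the forecasts probabilistic rather than point estimates.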
Keywords
time series forecasting, foundation models, lag-llama