Temporal self-attention-based Conv-LSTM network for multivariate time series prediction

Neurocomputing (2022)

Abstract
Time series play an important role in many fields, such as industrial control, automated monitoring, and weather forecasting. Because real-world problems often involve more than one variable, and these variables are related to each other, the multivariate time series (MTS) is introduced. Accurately predicting MTS from historical observations remains very challenging. Therefore, a new time series prediction model is proposed based on the temporal self-attention mechanism, a convolutional neural network, and long short-term memory (Conv-LSTM). When the standard attention mechanism for time series is combined with a recurrent neural network (RNN), it depends heavily on the hidden state of the RNN. In particular, at the first time step, an initial hidden state (typically 0) must be introduced artificially to calculate the attention weight of that step, which adds noise to the computation of the attention weights. To address this problem and increase the flexibility of the attention layer, a new self-attention mechanism, called temporal self-attention, is designed to extract the temporal dependence of the MTS. In this attention mechanism, long short-term memory (LSTM) is adopted as a sequence encoder to calculate the query, key, and value, capturing a more complete temporal dependence than standard self-attention. Owing to the flexibility of this structure, the DA-Conv-LSTM model, a state-of-the-art attention-based method for MTS prediction, is improved. Our improved model is compared with six baseline models on two public datasets (SML2010 and NASDAQ100) and applied to satellite state prediction on our private dataset. Experiments demonstrate the effectiveness of our temporal self-attention, and our improved model achieves the best short-term prediction performance.
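As a rough illustration of the idea described above, and not the authors' actual implementation, the following minimal PyTorch sketch replaces the linear Q/K/V projections of standard self-attention with LSTM encoders, so the attention weights at every time step (including the first) are computed from encoded sequence context rather than an artificial initial hidden state. The class name, the use of three separate single-layer LSTMs, and all dimensions are illustrative assumptions.

import torch
import torch.nn as nn
import torch.nn.functional as F

class TemporalSelfAttention(nn.Module):
    """Sketch: LSTM encoders produce the query, key, and value sequences
    (instead of linear projections), so no hand-set initial hidden state
    enters the attention-weight computation. Hypothetical design choices."""

    def __init__(self, input_dim: int, hidden_dim: int):
        super().__init__()
        # Assumption: one single-layer LSTM per projection (Q, K, V).
        self.q_enc = nn.LSTM(input_dim, hidden_dim, batch_first=True)
        self.k_enc = nn.LSTM(input_dim, hidden_dim, batch_first=True)
        self.v_enc = nn.LSTM(input_dim, hidden_dim, batch_first=True)
        self.scale = hidden_dim ** 0.5

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, time, input_dim) -- a window of MTS observations.
        q, _ = self.q_enc(x)  # (batch, time, hidden)
        k, _ = self.k_enc(x)
        v, _ = self.v_enc(x)
        # Scaled dot-product attention over the time axis.
        scores = q @ k.transpose(1, 2) / self.scale  # (batch, time, time)
        weights = F.softmax(scores, dim=-1)          # temporal attention map
        return weights @ v                           # (batch, time, hidden)

# Usage: attend over a 24-step window of 5 variables.
layer = TemporalSelfAttention(input_dim=5, hidden_dim=32)
out = layer(torch.randn(8, 24, 5))  # -> (8, 24, 32)

In the paper's full model this layer would feed the Conv-LSTM predictor; here it stands alone only to show how LSTM-encoded Q/K/V sidestep the initial-hidden-state noise of RNN-coupled attention.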
Keywords
Self-attention mechanism, Long short-term memory, Multivariate time series, Prediction