Causal augmented ConvNet: A temporal memory dilated convolution model for long-sequence time series prediction

ISA Transactions (2022)

Abstract
A number of deep learning models have been proposed to capture the inherent information in multivariate time series signals. However, most existing models are suboptimal, especially for long-sequence time series prediction tasks. This work presents a causal augmented convolution network (CaConvNet) and its application to long-sequence time series prediction. First, the model uses dilated convolution with enlarged receptive fields to enhance global feature extraction from time series. Second, to effectively capture long-term dependencies and to further extract multiscale features that represent different operating conditions, the model is augmented with a long short-term memory network. Third, CaConvNet is optimized with a dynamic hyperparameter search algorithm to reduce the uncertainty and cost of manual hyperparameter selection. Finally, the model is extensively evaluated on a predictive maintenance task using the turbofan aircraft engine run-to-failure prognostic benchmark dataset (C-MAPSS). The performance of the proposed CaConvNet is also compared with four conventional deep learning models and seven state-of-the-art predictive models. The evaluation metrics show that the proposed CaConvNet outperforms the other models in most of the prognostic tasks. Moreover, a comprehensive ablation study is performed to provide insights into the contribution of each sub-structure of the CaConvNet model to the observed performance. The results of the ablation study, as well as the performance improvement of CaConvNet, are discussed in this paper.
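To make the described architecture concrete, the sketch below shows a causal dilated 1-D convolution stack followed by an LSTM head, the general pattern the abstract describes. This is not the authors' implementation: the framework (PyTorch), channel sizes, dilation rates, window length, and the 24-feature input are illustrative assumptions, and the hyperparameter search stage is omitted.

```python
# Hedged sketch of a dilated-causal-conv + LSTM predictor (not the paper's code).
# All layer sizes and dilation rates below are placeholder assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F


class CausalDilatedConv1d(nn.Module):
    """1-D convolution with left-only padding so each output sees no future steps."""
    def __init__(self, in_ch, out_ch, kernel_size, dilation):
        super().__init__()
        self.pad = (kernel_size - 1) * dilation          # amount of left padding
        self.conv = nn.Conv1d(in_ch, out_ch, kernel_size, dilation=dilation)

    def forward(self, x):                                # x: (batch, channels, time)
        x = F.pad(x, (self.pad, 0))                      # pad on the left only
        return self.conv(x)


class CaConvNetSketch(nn.Module):
    """Dilated causal convolutions (global features) + LSTM (long-term dependency)."""
    def __init__(self, n_features, hidden=64, dilations=(1, 2, 4, 8)):
        super().__init__()
        layers, in_ch = [], n_features
        for d in dilations:                              # exponentially growing receptive field
            layers += [CausalDilatedConv1d(in_ch, hidden, kernel_size=3, dilation=d),
                       nn.ReLU()]
            in_ch = hidden
        self.conv_stack = nn.Sequential(*layers)
        self.lstm = nn.LSTM(hidden, hidden, batch_first=True)
        self.head = nn.Linear(hidden, 1)                 # e.g. a remaining-useful-life estimate

    def forward(self, x):                                # x: (batch, time, features)
        z = self.conv_stack(x.transpose(1, 2))           # -> (batch, hidden, time)
        out, _ = self.lstm(z.transpose(1, 2))            # -> (batch, time, hidden)
        return self.head(out[:, -1])                     # predict from the last time step


# Usage with an assumed C-MAPSS-style input: 24 channels over a 50-step window.
model = CaConvNetSketch(n_features=24)
rul = model(torch.randn(8, 50, 24))                      # -> (8, 1) predictions
```

The left-only padding keeps the convolution causal, while the increasing dilation rates enlarge the receptive field without extra parameters; the LSTM then models long-range temporal dependencies on top of the convolutional features, which is the combination the abstract attributes to CaConvNet.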
Keywords
Dilated convolution neural network, Deep learning, Remaining useful life, Time series, Predictive maintenance