Self-Attention Transformer-Based Architecture for Remaining Useful Life Estimation of Complex Machines

Procedia Computer Science (2023)

Abstract
Extracting meaningful features from multivariate time-series data remains challenging, since it must account for both the correlations between pairs of sensors and the temporal information within each time series. Meanwhile, large industrial systems have evolved into data-rich environments, driving the rapid development and deployment of deep learning for machine RUL prediction. Remaining Useful Life (RUL) characterizes a system's behavior over the course of its lifetime, that is, from the last inspection until the system's performance deteriorates beyond a given threshold. RUL estimation has been addressed with Long Short-Term Memory (LSTM) networks and Convolutional Neural Networks (CNNs), particularly in complex tasks involving high-dimensional nonlinear data; the main focus, however, has been on simulated degradation data. In 2021, a new realistic run-to-failure turbofan engine degradation dataset was released that differs significantly from the simulation datasets. The key difference is that the flight duration varies from cycle to cycle, so existing deep learning techniques are ineffective at predicting RUL on this real-world degradation data. We present a Self-Attention Transformer-Based Encoder model to address this problem. The encoder, together with a time-stamp encoding layer, works in parallel to extract features from different sensors at different time stamps. Self-attention enables efficient processing of long sequences and focuses on the key elements of the input time series; the proposed Transformer model uses self-attention to access global characteristics from diverse time-series representations. We conduct tests on turbofan engine degradation data with variable-length inputs under real-world flight conditions. Empirical results indicate that the proposed approach estimates turbofan engine RUL efficiently.
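To make the described architecture concrete, the following is a minimal sketch of a self-attention Transformer encoder for RUL regression over variable-length flight sequences, written in PyTorch. The layer sizes, the learned per-time-stamp embedding (standing in for the paper's time-stamp encoding layer), and the masked mean pooling are illustrative assumptions, not the authors' exact configuration.

```python
# Hypothetical sketch: Transformer encoder for RUL regression on
# variable-length multivariate sensor sequences (not the paper's code).
import torch
import torch.nn as nn

class RULTransformer(nn.Module):
    def __init__(self, n_sensors: int, d_model: int = 64, n_heads: int = 4,
                 n_layers: int = 2, max_len: int = 2000):
        super().__init__()
        # Project each multivariate sensor reading into the model dimension.
        self.input_proj = nn.Linear(n_sensors, d_model)
        # Learned per-time-stamp embedding; a simplified stand-in for the
        # time-stamp encoding layer described in the abstract.
        self.time_embed = nn.Embedding(max_len, d_model)
        layer = nn.TransformerEncoderLayer(d_model=d_model, nhead=n_heads,
                                           dim_feedforward=4 * d_model,
                                           batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=n_layers)
        self.head = nn.Linear(d_model, 1)  # scalar RUL estimate

    def forward(self, x: torch.Tensor, pad_mask: torch.Tensor) -> torch.Tensor:
        # x: (batch, seq_len, n_sensors); pad_mask: (batch, seq_len),
        # True where a position is padding (handles variable-length flights).
        t = torch.arange(x.size(1), device=x.device)
        h = self.input_proj(x) + self.time_embed(t)
        h = self.encoder(h, src_key_padding_mask=pad_mask)
        # Masked mean pooling over valid (non-padded) time steps only.
        valid = (~pad_mask).unsqueeze(-1).float()
        pooled = (h * valid).sum(dim=1) / valid.sum(dim=1).clamp(min=1.0)
        return self.head(pooled).squeeze(-1)

# Usage: a batch of two flights of different lengths, zero-padded to 100 steps.
model = RULTransformer(n_sensors=14)
x = torch.zeros(2, 100, 14)
pad_mask = torch.zeros(2, 100, dtype=torch.bool)
pad_mask[1, 60:] = True  # second flight is only 60 steps long
rul = model(x, pad_mask)  # shape: (2,)
```

The padding mask is what lets self-attention handle the dataset's varying flight durations: attention is computed only over real time steps, so sequences of different lengths can share a batch without the padding distorting the pooled representation.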
Keywords
Remaining Useful Life (RUL), Deep Learning (DL), Self-Attention (SA), Transformer Model (TM)