Construction of Power Fault Knowledge Graph Based on Deep Learning

Applied Sciences-Basel (2022)

Abstract
A knowledge graph can structure heterogeneous knowledge in the field of power faults, establish correlations between different pieces of knowledge, and address the diversity, complexity, and siloing of fault data. Power fault defect text contains many kinds of entities, the relationships between entities are complex, and the data are often mixed with noise, so it is necessary to study how to effectively mine the target data and separate salient knowledge from noise. Moreover, the traditional entity and relation extraction methods used to construct a power fault knowledge graph cannot fully capture text semantics, and their accuracy is low. Log systems usually contain all kinds of fault-related information, and log analysis helps collect fault information and perform association analysis. Therefore, a Bidirectional Sliced GRU with Gated Attention mechanism (BiSGRU-GA) model is proposed to detect anomalous logs in the power system; this enriches the fault knowledge base and provides a good data resource for constructing the knowledge graph. A new Bidirectional GRU model with a Gated Attention mechanism, Conditional Random Fields, and a BERT input layer (BBiGRU-GA-CRF) is proposed by introducing a BERT layer and an attention mechanism into the Bidirectional GRU (BiGRU) model, in order to more fully capture the context of fault sentences and improve the accuracy of entity recognition. To address the high computational cost and error propagation of traditional relation extraction models, an improved Bidirectional Gated Recurrent Unit network with fewer parameters and a Gated Attention Mechanism (BiGRU-GA) is proposed; the improved gated attention mechanism yields better relation extraction results.
Compared with Bidirectional Long Short-Term Memory with Attention Mechanism (BiLSTM-Attention), the model improves accuracy, recall, and F-measure by 1.79%, 13.83%, and 0.30%, respectively, while reducing time cost by about 16%. The experimental results show that the BiGRU-GA model can capture local features, reduce training time, and improve recognition performance.
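The gated attention idea described above can be illustrated with a minimal sketch. This is a hypothetical NumPy implementation, not the paper's actual code: it assumes gated attention means a softmax over per-token scores combined with a sigmoid gate that modulates each BiGRU hidden state before pooling; the function and weight names (`gated_attention`, `w_score`, `w_gate`, `b_gate`) are invented for illustration.

```python
import numpy as np

def gated_attention(hidden_states, w_score, w_gate, b_gate):
    """Pool BiGRU hidden states with softmax attention plus a sigmoid gate.

    hidden_states: (T, d) array, one row of BiGRU output per token.
    Returns the (d,) pooled context vector and the (T,) attention weights.
    """
    scores = hidden_states @ w_score                # (T,) unnormalized scores
    scores = scores - scores.max()                  # shift for numeric stability
    alpha = np.exp(scores) / np.exp(scores).sum()   # softmax attention weights
    # Element-wise sigmoid gate decides how much of each hidden unit passes through
    gate = 1.0 / (1.0 + np.exp(-(hidden_states @ w_gate + b_gate)))  # (T, d)
    context = (alpha[:, None] * gate * hidden_states).sum(axis=0)    # (d,)
    return context, alpha

# Toy usage with random weights standing in for learned parameters
rng = np.random.default_rng(0)
T, d = 5, 8                                         # 5 tokens, hidden size 8
H = rng.standard_normal((T, d))
ctx, alpha = gated_attention(H,
                             rng.standard_normal(d),
                             rng.standard_normal((d, d)),
                             np.zeros(d))
print(ctx.shape)       # pooled vector has the hidden dimension
print(alpha.sum())     # attention weights sum to ~1
```

In a full relation-extraction model, `ctx` would feed a classification layer over relation types; the gate is what distinguishes this from plain attention pooling, letting the model suppress noisy dimensions of individual hidden states.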
Keywords
power failure, knowledge graph, attention mechanism, GRU, BERT