Electric Power Audit Text Classification With Multi-Grained Pre-Trained Language Model

Qinglin Meng, Yan Song, Jian Mu, Yuanxu Lv, Jiachen Yang, Liang Xu, Jin Zhao, Junwei Ma, Wei Yao, Rui Wang, Maoxiang Xiao, Qingyu Meng

IEEE Access (2023)

Abstract
Electric power audit text classification is an important research problem in electric power systems. Recently, a variety of automatic classification methods for these texts based on machine learning or deep learning models have been applied. Advances in computing technology have made "pre-training and fine-tuning" the prevailing paradigm for text classification, achieving better results than earlier fully supervised models. According to pre-training theory, domain-related pre-training tasks can enhance the performance of downstream tasks in the corresponding domain. However, existing pre-trained models are usually trained on general corpora and do not use texts related to the electric power field, especially electric power audit texts. As a result, the model learns little electric-power-related morphology or semantics during pre-training, so less domain information is available in the fine-tuning stage. Motivated by this gap, we propose EPAT-BERT, a BERT-based model pre-trained with two tasks of different granularity: a word-level masked language model and an entity-level masked language model. These two tasks predict words and entities in electric-power-related texts to learn rich morphology and semantics about electric power. We then fine-tune EPAT-BERT for the electric power audit text classification task. Experimental results show that EPAT-BERT significantly outperforms fully supervised machine learning models, neural network models, and general pre-trained language models on a variety of evaluation metrics. Therefore, EPAT-BERT can be further applied to electric power audit text classification. We also conduct ablation studies to verify the effectiveness of each component of EPAT-BERT and further support our motivations.
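The abstract does not include implementation details, but the two-granularity masking objective can be illustrated with a minimal sketch. The Python snippet below is an assumption-laden illustration, not the authors' code: the function names, the entity lexicon, and the masking probabilities (15% for word-level, 50% for entity spans) are hypothetical stand-ins chosen for demonstration. Word-level masking corrupts individual tokens as in standard BERT, while entity-level masking corrupts whole domain-entity spans so the model must recover them from context.

```python
import random

MASK = "[MASK]"

def mask_word_level(tokens, mask_prob=0.15):
    """Word-level MLM (BERT-style): mask individual tokens at random."""
    masked, labels = [], []
    for tok in tokens:
        if random.random() < mask_prob:
            masked.append(MASK)
            labels.append(tok)      # the model must recover this token
        else:
            masked.append(tok)
            labels.append(None)     # position ignored by the MLM loss
    return masked, labels

def mask_entity_level(tokens, entities, mask_prob=0.5):
    """Entity-level MLM: mask whole domain-entity spans as a unit."""
    masked, labels = list(tokens), [None] * len(tokens)
    i = 0
    while i < len(masked):
        # find an entity span starting at position i, if any
        span = next((e for e in entities
                     if tokens[i:i + len(e)] == list(e)), None)
        if span is not None and random.random() < mask_prob:
            for j in range(len(span)):          # mask the entire span
                labels[i + j] = tokens[i + j]
                masked[i + j] = MASK
            i += len(span)
        else:
            i += 1
    return masked, labels

if __name__ == "__main__":
    random.seed(0)
    # Hypothetical entity lexicon; the paper's actual inventory of
    # electric-power entities is not specified in the abstract.
    entities = [("power", "transformer"), ("audit", "report")]
    tokens = ("the audit report flags the power transformer "
              "maintenance cost").split()
    print(mask_word_level(tokens))
    print(mask_entity_level(tokens, entities))
```

In this sketch, the entity-level task forces the model to predict multi-token domain terms such as "power transformer" jointly rather than one token at a time, which is the intuition behind learning entity-level semantics on top of word-level morphology.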
Keywords
Power systems, Task analysis, Text categorization, Data models, Computational modeling, Natural language processing, Pre-trained language model, Text classification, Electric power audit text, Masked language model