Semi-Supervised Contrast Learning Based on Multiscale Attention and Multitarget Contrast Learning for Bearing Fault Diagnosis

IEEE Transactions on Industrial Informatics (2023)

Citations: 5 | Views: 16
Abstract
In the field of bearing fault diagnosis, it is difficult to obtain a large amount of labeled data for training, so the model easily overfits, which reduces its robustness and generalizability. To address this problem, this article proposes a novel semi-supervised contrastive learning (SSCL) method based on a multiscale attention (MSA) mechanism and multitarget contrastive learning (MCL) for rolling bearing fault diagnosis under limited labeled samples. First, the proposed SSCL pretrains the model with improved MCL to capture potentially generic features in unlabeled data. Then, building on the pretrained model, semi-supervised contrastive learning combines a limited amount of labeled data with a large amount of unlabeled data to jointly learn the feature mapping, and an MSA mechanism further enhances the feature extraction capability of the model. Three common motor-bearing datasets, from Case Western Reserve University, the Society for Mechanical Failure Prevention Techniques, and Paderborn University, are used in this article to test the proposed technique. The experimental results demonstrate that SSCL outperforms other approaches under comparable conditions.
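
To make the two-stage pipeline described above concrete, the following is a minimal PyTorch sketch, not the authors' code. The encoder, the multiscale attention block, and all layer sizes and hyperparameters are illustrative assumptions, and a standard SimCLR-style NT-Xent loss is used as a stand-in for the paper's multitarget contrastive objective.

```python
# Illustrative sketch only: contrastive pretraining on unlabeled vibration segments,
# then joint training with a small labeled set. Architecture and hyperparameters are
# assumptions, not the implementation from the paper.
import torch
import torch.nn as nn
import torch.nn.functional as F


class MultiscaleAttention(nn.Module):
    """Parallel 1-D convolutions at several kernel sizes, fused with learned attention weights."""

    def __init__(self, channels, kernel_sizes=(3, 5, 7)):
        super().__init__()
        self.branches = nn.ModuleList(
            nn.Conv1d(channels, channels, k, padding=k // 2) for k in kernel_sizes
        )
        self.attn = nn.Linear(channels, len(kernel_sizes))

    def forward(self, x):                                            # x: (batch, channels, length)
        feats = torch.stack([b(x) for b in self.branches], dim=1)    # (B, scales, C, L)
        weights = torch.softmax(self.attn(x.mean(dim=-1)), dim=-1)   # (B, scales)
        return (weights[:, :, None, None] * feats).sum(dim=1)        # (B, C, L)


class Encoder(nn.Module):
    def __init__(self, num_classes=10, feat_dim=64):
        super().__init__()
        self.stem = nn.Sequential(nn.Conv1d(1, 32, 15, stride=2, padding=7), nn.ReLU())
        self.msa = MultiscaleAttention(32)
        self.pool = nn.Sequential(nn.AdaptiveAvgPool1d(1), nn.Flatten(), nn.Linear(32, feat_dim))
        self.classifier = nn.Linear(feat_dim, num_classes)

    def forward(self, x):                                  # x: (batch, 1, length)
        z = self.pool(self.msa(self.stem(x)))              # embedding used for the contrastive loss
        return z, self.classifier(z)                       # (embedding, class logits)


def nt_xent(z1, z2, temperature=0.2):
    """Contrastive loss between two augmented views of the same batch (SimCLR-style)."""
    n = z1.size(0)
    z = F.normalize(torch.cat([z1, z2]), dim=1)
    sim = z @ z.t() / temperature
    sim = sim.masked_fill(torch.eye(2 * n, dtype=torch.bool, device=z.device), float("-inf"))
    targets = torch.cat([torch.arange(n, 2 * n), torch.arange(0, n)]).to(z.device)
    return F.cross_entropy(sim, targets)


# Stage 1 (pretraining): minimize nt_xent over two augmented views of unlabeled signals.
# Stage 2 (semi-supervised): total loss = cross-entropy on the few labeled samples
#                            + lambda * contrastive loss on the unlabeled samples,
#                            starting from the stage-1 encoder weights.
```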
Keywords
Bearing fault diagnosis, limited labeled, multiscale attention (MSA), multitarget contrast learning (MCL), semi-supervised learning (SSL)