Self-Paced Co-Training of Graph Neural Networks for Semi-Supervised Node Classification

IEEE Transactions on Neural Networks and Learning Systems (2023)

Cited by 22 | Views 39
Abstract
Graph neural networks (GNNs) have demonstrated great success in many graph data-based applications. The impressive performance of GNNs typically relies on the availability of a sufficient amount of labeled data for model training. In practice, however, obtaining a large number of annotations is prohibitively labor-intensive or even impossible. Co-training is a popular semi-supervised learning (SSL) paradigm that trains multiple models on a common training set while augmenting the limited labeled data available to each model with pseudolabeled data generated from the predictions of the other models. Most existing co-training works do not control the quality of the pseudolabeled data they use. As a result, inaccurate pseudolabels generated by immature models in the early stage of training are likely to cause noticeable errors when used to augment the training data of other models. To address this issue, we propose a self-paced co-training framework for GNNs (SPC-GNN) for semi-supervised node classification. The framework trains multiple GNNs with the same or different structures on different representations of the same training data. Each GNN performs SSL using both the originally available labeled data and the pseudolabeled data generated by the other GNNs. To control the quality of the pseudolabels, a self-paced label augmentation strategy is designed so that pseudolabels generated at a higher confidence level are utilized earlier during training, thereby mitigating the negative impact of inaccurate pseudolabels on training-data augmentation and, in turn, on the subsequent training process. Finally, each trained GNN is evaluated on a validation set, and the best-performing one is chosen as the output. To improve the training effectiveness of the framework, we devise a pretraining stage followed by a two-step optimization scheme for training the GNNs. Experimental results on the node classification task demonstrate that the proposed framework achieves significant improvement over state-of-the-art SSL methods.
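The mechanism the abstract describes can be made concrete with a short sketch: several GNNs are trained on different views of the same graph, each augmenting its labeled set with high-confidence pseudolabels produced by its peers, while the confidence threshold is relaxed as training proceeds (the self-paced schedule). The following is a minimal sketch assuming PyTorch and PyG-style models called as model(x, edge_index); the function name, the linear threshold schedule, and all hyperparameters are illustrative assumptions, not the authors' released implementation.

import torch
import torch.nn.functional as F

# Minimal sketch of the self-paced co-training loop described above.
# Assumptions (not from the paper's code): PyG-style models called as
# model(x, edge_index), a linear confidence-threshold schedule, and
# averaging of peer predictions to form pseudolabels.
def self_paced_co_train(gnns, optimizers, views, labels, labeled_mask,
                        epochs=200, tau_start=0.95, tau_end=0.70):
    for epoch in range(epochs):
        # Self-paced schedule: strict threshold early, looser later, so
        # only high-confidence pseudolabels enter training at first.
        tau = tau_start + (tau_end - tau_start) * epoch / max(epochs - 1, 1)

        # Current class probabilities of every model on its own view.
        with torch.no_grad():
            probs = [F.softmax(m(x, ei), dim=1)
                     for m, (x, ei) in zip(gnns, views)]

        for i, (model, opt, (x, ei)) in enumerate(zip(gnns, optimizers, views)):
            # Co-training: pseudolabels for model i come from its peers.
            peer = torch.stack([p for j, p in enumerate(probs) if j != i]).mean(0)
            conf, pseudo = peer.max(dim=1)
            pseudo_mask = (conf >= tau) & ~labeled_mask

            model.train()
            opt.zero_grad()
            out = model(x, ei)
            loss = F.cross_entropy(out[labeled_mask], labels[labeled_mask])
            if pseudo_mask.any():
                # Augment training with the peers' confident pseudolabels only.
                loss = loss + F.cross_entropy(out[pseudo_mask], pseudo[pseudo_mask])
            loss.backward()
            opt.step()

The pretraining stage and the two-step optimization scheme mentioned in the abstract are omitted here; after the loop, each trained model would be scored on a validation split and the best-performing one returned, as the paper describes.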
Keywords
Training, Data models, Task analysis, Graph neural networks, Training data, Predictive models, Optimization, Co-training, graph neural networks (GNNs), node classification, self-paced learning (SPL), semi-supervised learning (SSL)