ConCur

Rong Yan, Peng Bao

Neurocomputing (2023)

Abstract
Contrastive learning has made breakthrough advances in graph representation learning by encouraging the representations of positive samples to be close and those of negative samples to be far apart. However, existing graph contrastive learning (GCL) frameworks devote great effort to designing different augmentation strategies for positive samples, while randomly using all other nodes as negative samples and treating them equally, completely ignoring the differences between negative samples. Moreover, almost every GCL framework replaces the original graph with different augmented views, which may lead to unexpected information loss caused by randomly perturbing edges and features. To address these issues, we propose a self-supervised graph Contrastive learning framework with Curriculum negative sampling, called ConCur, which feeds negative samples into contrastive learning in an easy-to-hard fashion via our proposed curriculum negative sampling strategy. More specifically, ConCur consists of two phases: Graph Augmentations and Curriculum Contrastive Training. Graph Augmentations construct positive and negative samples through different graph augmentation strategies. In Curriculum Contrastive Training, we first use a triplet network to learn node representations, taking the original graph and the different augmented views as input. Then, we apply the proposed curriculum negative sampling strategy to enumerate negative samples from easy to hard for contrastive training. Finally, we use a unified contrastive loss to optimize the node representations. Comprehensive experiments on five real-world datasets show that ConCur yields encouraging results on the node classification task.
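The sketch below illustrates the general idea of curriculum negative sampling in a node-level contrastive (InfoNCE-style) loss, not the authors' implementation. The difficulty measure (cosine similarity of a negative to the anchor), the linear pacing schedule, the function name `curriculum_infonce`, and all hyperparameters are assumptions introduced for illustration.

```python
# Illustrative sketch only (not the ConCur code): contrastive loss where
# negatives are admitted from easy to hard over training.
# Assumptions: "difficulty" of a negative = cosine similarity to the anchor;
# a linear pacing function controls the fraction of hardest negatives used.
import torch
import torch.nn.functional as F


def curriculum_infonce(anchor, positive, candidates, epoch, max_epochs,
                       temperature=0.5):
    """Easy-to-hard contrastive loss.

    anchor, positive: (N, d) embeddings of each node and its augmented view.
    candidates:       (N, M, d) candidate negative embeddings per anchor.
    """
    anchor = F.normalize(anchor, dim=-1)
    positive = F.normalize(positive, dim=-1)
    candidates = F.normalize(candidates, dim=-1)

    # Difficulty proxy: negatives more similar to the anchor are "harder".
    neg_sim = torch.einsum('nd,nmd->nm', anchor, candidates)  # (N, M)

    # Linear pacing: start with the easiest 20% of negatives, end with all.
    frac = min(1.0, 0.2 + 0.8 * epoch / max_epochs)
    k = max(1, int(frac * neg_sim.size(1)))
    easy_sim, _ = torch.topk(neg_sim, k, dim=1, largest=False)  # easiest k

    pos_sim = (anchor * positive).sum(dim=-1, keepdim=True)     # (N, 1)
    logits = torch.cat([pos_sim, easy_sim], dim=1) / temperature
    labels = torch.zeros(logits.size(0), dtype=torch.long,
                         device=logits.device)  # positive sits at index 0
    return F.cross_entropy(logits, labels)
```

Under this kind of schedule, early epochs contrast each anchor only against clearly dissimilar nodes, and harder negatives are introduced gradually, mirroring the easy-to-hard ordering described in the abstract.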
Keywords
Graph representation learning, Contrastive learning, Curriculum learning, Graph augmentation, Triplet network