Adaptive graph contrastive learning with joint optimization of data augmentation and graph encoder

Zhenpeng Wu, Jiamin Chen, Raeed Al-Sabri, Babatounde Moctard Oloulade, Jianliang Gao

Knowledge and Information Systems (2024)

Abstract
Graph contrastive learning (GCL) has been successfully used to reduce the huge cost of graph data annotation, including labor, time, and professional-expertise costs. Recent works have focused on improving the generalization performance of GCL with automated data augmentation. However, GCL methods with automated data augmentation encode graph representations using fixed graph encoders, which results in performance loss. To overcome this limitation, we propose Adaptive graph contrastive learning with joint optimization of data Augmentation and graph Encoder (AdaAE). AdaAE is the first method that learns to adapt the graph encoder to each dataset in GCL with automated data augmentation. Specifically, we design a unified GCL search space that treats both the data augmentation and the graph encoder as architecture components. AdaAE employs an adaptive architecture optimizer based on a differentiable search method to learn a sampling probability distribution over the candidate operations of each architecture component. The adaptive architecture optimizer samples an operation for each architecture component from this distribution to construct the GCL model. The validation results of the GCL model are then used as the feedback signal for the adaptive architecture optimizer to update the sampling probability distribution of each architecture component. Extensive experiments demonstrate that AdaAE outperforms state-of-the-art baselines. Furthermore, the visualization results confirm that AdaAE clearly distinguishes different classes in the projection space.
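The search loop described in the abstract can be sketched as follows. This is a minimal, hypothetical illustration, not the paper's implementation: the candidate operation names, the search-space layout, and the score-function update (standing in for the paper's differentiable search) are all assumptions made for the sketch. It shows the core idea of maintaining a sampling probability distribution per architecture component, sampling one operation per component to build a GCL model, and sharpening the distributions with a validation feedback signal.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical unified GCL search space: both the data augmentation and the
# graph encoder are treated as architecture components with candidate
# operations (names illustrative, not taken from the paper).
SEARCH_SPACE = {
    "augmentation": ["edge_drop", "node_drop", "feature_mask", "subgraph"],
    "encoder": ["gcn", "gat", "gin", "graphsage"],
}

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

class AdaptiveArchitectureOptimizer:
    """Toy optimizer: one learnable logit per candidate operation per
    component; validation reward pushes probability mass toward sampled
    operations (a simple surrogate for the differentiable search)."""

    def __init__(self, space, lr=0.5):
        self.space = space
        self.lr = lr
        self.alpha = {k: np.zeros(len(ops)) for k, ops in space.items()}

    def sample(self):
        """Draw one operation per component from softmax(alpha)."""
        arch, idx = {}, {}
        for comp, ops in self.space.items():
            p = softmax(self.alpha[comp])
            i = rng.choice(len(ops), p=p)
            arch[comp], idx[comp] = ops[i], i
        return arch, idx

    def update(self, idx, reward, baseline=0.0):
        """Score-function update: increase the logits of the sampled
        operations in proportion to (reward - baseline)."""
        for comp, i in idx.items():
            grad = -softmax(self.alpha[comp])
            grad[i] += 1.0
            self.alpha[comp] += self.lr * (reward - baseline) * grad

# Stand-in for training the sampled GCL model and validating it: pretend
# one particular augmentation/encoder pairing generalizes best.
def validate(arch):
    return 0.9 if (arch["augmentation"] == "edge_drop"
                   and arch["encoder"] == "gin") else 0.5

opt = AdaptiveArchitectureOptimizer(SEARCH_SPACE)
for _ in range(300):
    arch, idx = opt.sample()
    opt.update(idx, validate(arch), baseline=0.5)

best = {c: SEARCH_SPACE[c][int(np.argmax(a))] for c, a in opt.alpha.items()}
print(best)
```

In the real method the model construction step would instantiate and train the sampled augmentation and encoder on the contrastive objective before validating; here `validate` is a fixed stand-in so the loop runs end to end.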
Keywords
Graph contrastive learning, Graph neural network, Graph neural architecture search, Data augmentation