FLOOD: A Flexible Invariant Learning Framework for Out-of-Distribution Generalization on Graphs

KDD 2023

Citations: 6 | Views: 138
Abstract
Graph Neural Networks (GNNs) have achieved remarkable success in various domains, but most are developed under the in-distribution assumption. In out-of-distribution (OOD) settings, they suffer from the distribution shift between the training and test sets and may not generalize well to the test distribution. Several methods have applied the invariance principle to improve the OOD generalization of GNNs. However, in these solutions the graph encoder is frozen after invariant learning and cannot be adapted flexibly to the target distribution. Under distribution shift, an encoder that can be refined toward the target distribution can generalize better on the test set than a fixed invariant encoder. To remedy these weaknesses, we propose FLOOD, a Flexible invariant Learning framework for Out-Of-Distribution generalization on graphs, which comprises two key components: invariant learning and bootstrapped learning. The invariant learning component constructs multiple environments via graph data augmentation and learns invariant representations under risk extrapolation. The bootstrapped learning component is trained in a self-supervised way with a graph encoder shared with the invariant learning part. At test time, the shared encoder can be flexibly refined via bootstrapped learning on the test set. Extensive experiments on both transductive and inductive node classification tasks demonstrate that FLOOD consistently outperforms other graph OOD generalization methods and effectively improves generalization ability.
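As a rough illustration of the risk-extrapolation objective the abstract refers to, here is a minimal sketch of a V-REx-style loss over per-environment risks: the mean training risk plus a variance penalty that pushes the encoder to perform equally well in every augmented environment. The function name, signature, and the penalty weight `beta` are illustrative assumptions, not taken from the paper.

```python
def vrex_loss(env_risks, beta=1.0):
    """Risk extrapolation (V-REx style) objective: average the risks
    computed in each training environment, then add a variance penalty
    so that no single environment dominates the learned representation.
    `env_risks` is a list of scalar risks, one per environment;
    `beta` (hypothetical here) trades off mean risk vs. invariance."""
    n = len(env_risks)
    mean = sum(env_risks) / n
    variance = sum((r - mean) ** 2 for r in env_risks) / n
    return mean + beta * variance

# When all environments incur the same risk, the penalty vanishes:
# vrex_loss([0.5, 0.5]) -> 0.5
# Unequal risks are penalized:
# vrex_loss([0.0, 1.0]) -> 0.5 (mean) + 0.25 (variance) = 0.75
```

In a full implementation, each environment's risk would be the classification loss on a differently augmented view of the graph, and this scalar would be backpropagated through the shared encoder.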
Keywords
graph neural networks,out-of-distribution,invariant learning