Towards Robust Graph Incremental Learning on Evolving Graphs
ICML 2023
Abstract
Incremental learning is a machine learning approach that involves training a
model on a sequence of tasks, rather than all tasks at once. This ability to
learn incrementally from a stream of tasks is crucial for many real-world
applications. However, incremental learning is challenging on graph-structured
data: many graph-related problems involve node-level prediction, a setting
known as Node-wise Graph Incremental Learning (NGIL). Because nodes are
connected, the samples are neither independent nor identically distributed,
which makes it difficult to maintain model performance as new tasks arrive.
In this paper, we focus on the
inductive NGIL problem, which accounts for the evolution of graph structure
(structural shift) induced by emerging tasks. We provide a formal formulation
and analysis of the problem, and propose a novel regularization-based
technique, Structural-Shift-Risk-Mitigation (SSRM), to mitigate the impact of
structural shift on catastrophic forgetting in the inductive NGIL setting. We
show that structural shift can shift the input distribution of existing tasks
and, in turn, increase the risk of catastrophic forgetting. Through
comprehensive empirical studies on several benchmark datasets, we demonstrate
that SSRM is flexible, easy to adapt, and improves the performance of
state-of-the-art GNN incremental learning frameworks in the inductive setting.
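
The abstract does not spell out the form of the SSRM regularizer, only that it
is a regularization-based technique against structural shift. The minimal
sketch below illustrates one plausible reading of that idea: penalize the
discrepancy between embeddings of old-task nodes computed on the graph before
and after new nodes arrive. The RBF-kernel MMD distance, the `TinyGCN` model,
the toy graphs, and the weight 0.1 are all assumptions for illustration, not
the paper's actual method or setup.

```python
# Hypothetical sketch of a structural-shift regularizer for inductive NGIL.
# Assumption: the regularizer aligns old-task node embeddings across the
# pre- and post-evolution graph; the distance (MMD) is illustrative only.
import torch
import torch.nn as nn

class TinyGCN(nn.Module):
    """One-layer GCN on a dense row-normalized adjacency (toy model)."""
    def __init__(self, in_dim, hid_dim):
        super().__init__()
        self.lin = nn.Linear(in_dim, hid_dim)

    def forward(self, adj, x):
        deg = adj.sum(1, keepdim=True).clamp(min=1.0)
        return torch.relu(self.lin((adj / deg) @ x))

def rbf_mmd(x, y, sigma=1.0):
    """Biased estimate of squared MMD under an RBF kernel."""
    k = lambda a, b: torch.exp(-torch.cdist(a, b) ** 2 / (2 * sigma ** 2))
    return k(x, x).mean() + k(y, y).mean() - 2.0 * k(x, y).mean()

# Toy evolving graph: 6 old-task nodes, then 4 new nodes arrive.
n_old, n_new, d = 6, 4, 8
x_old = torch.randn(n_old, d)
x_all = torch.cat([x_old, torch.randn(n_new, d)])
adj_old = (torch.rand(n_old, n_old) > 0.5).float()
adj_all = (torch.rand(n_old + n_new, n_old + n_new) > 0.5).float()

model = TinyGCN(d, 16)
task_loss = torch.tensor(0.0)  # placeholder for the current task's loss
z_before = model(adj_old, x_old)          # old-task nodes, old structure
z_after = model(adj_all, x_all)[:n_old]   # same nodes, evolved structure
loss = task_loss + 0.1 * rbf_mmd(z_before, z_after)  # weight 0.1 assumed
loss.backward()
```

Because the penalty depends only on embeddings, a sketch like this plugs into
existing GNN incremental learning frameworks as an extra loss term, consistent
with the abstract's claim that SSRM is flexible and easy to adapt.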