Two Trades is not Baffled: Condensing Graph via Crafting Rational Gradient Matching
CoRR (2024)
Abstract
Training on large-scale graphs has achieved remarkable results in graph
representation learning, but its computational and storage costs have raised growing concerns.
As one of the most promising directions, graph condensation methods address
these issues by employing gradient matching, aiming to condense the full graph
into a more concise yet information-rich synthetic set. Though encouraging,
these strategies primarily emphasize matching directions of the gradients,
which leads to deviations in the training trajectories. Such deviations are
further magnified by the differences between the condensation and evaluation
phases, culminating in accumulated errors, which detrimentally affect the
performance of the condensed graphs. In light of this, we propose a novel graph
condensation method named CrafTing RationaL
trajectory (CTRL), which offers an optimized starting point closer to
the original dataset's feature distribution and a more refined strategy for
gradient matching. Theoretically, CTRL can effectively neutralize the impact of
accumulated errors on the performance of condensed graphs. We provide extensive
experiments on various graph datasets and downstream tasks to support the
effectiveness of CTRL. Code is released at
https://github.com/NUS-HPC-AI-Lab/CTRL.
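To make the distinction concrete, the sketch below illustrates a gradient-matching loss that penalizes both the directional mismatch (cosine distance) and the magnitude gap between real and synthetic gradients, rather than direction alone. This is a hypothetical illustration of the general idea, not the authors' exact formulation; the function name `matching_loss` and the weighting `beta` are assumptions.

```python
import math

def matching_loss(g_real, g_syn, beta=0.5):
    """Illustrative gradient-matching loss (not the paper's exact objective).

    Combines:
      - directional term: 1 - cosine similarity of the two gradients
      - magnitude term: absolute gap between their Euclidean norms
    `beta` is a hypothetical trade-off weight between the two terms.
    """
    dot = sum(a * b for a, b in zip(g_real, g_syn))
    norm_real = math.sqrt(sum(a * a for a in g_real))
    norm_syn = math.sqrt(sum(b * b for b in g_syn))
    direction = 1.0 - dot / (norm_real * norm_syn)  # cosine distance
    magnitude = abs(norm_real - norm_syn)           # norm gap
    return beta * direction + (1.0 - beta) * magnitude
```

A direction-only criterion would assign zero loss to a synthetic gradient that points the right way but has the wrong scale; the magnitude term above removes that blind spot, which is the kind of deviation the abstract attributes to accumulated trajectory errors.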