Enhancing Performance of Coarsened Graphs with Gradient-Matching

ICASSP 2024 - 2024 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP)

Abstract
Graph Neural Networks (GNNs) are powerful tools for processing graph data, but training them is typically time-consuming and expensive in terms of GPU memory. Recently, graph reduction techniques have been investigated as a pre-processing step, because training GNNs on the reduced graphs incurs only sublinear time and space complexity. Among these techniques, graph coarsening has gained significant attention. By generating a coarse graph that preserves structural properties of the original, it can reduce the number of nodes by up to a factor of ten without significantly compromising performance. However, the coarsening phase of this method is purely heuristic, leaving room for improvement. In this paper, we propose Learnable Graph Coarsening with Gradient Matching (LGCGM), which boosts graph coarsening with a downstream-task-specific dataset-condensation objective. To the best of our knowledge, LGCGM is the first work to incorporate downstream-task information into graph coarsening. Extensive experiments demonstrate that this additional information effectively enhances the quality of coarse graphs, and that LGCGM outperforms previous graph condensation methods on large graphs.
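The abstract describes the approach only at a high level. As a rough illustration of the two ingredients it names, the sketch below shows (a) coarsening a graph through a partition matrix and (b) a dataset-condensation-style gradient-matching loss in PyTorch. Everything here is an assumption made for illustration, not the paper's actual LGCGM implementation: the GNN call signature model(x, adj), the partition matrix P, and the cosine-distance matching metric are all hypothetical choices.

```python
# Hypothetical sketch: gradient matching for a learnable coarse graph.
# All names and signatures below are illustrative assumptions, not the
# paper's actual LGCGM code.
import torch
import torch.nn.functional as F

def coarsen(adj: torch.Tensor, x: torch.Tensor, P: torch.Tensor):
    """Project an n-node graph onto c super-nodes with a (possibly soft)
    partition matrix P of shape (n, c): A_c = P^T A P, X_c = P^T X."""
    return P.t() @ adj @ P, P.t() @ x

def gradient_matching_loss(model, x, adj, y, xc, adjc, yc):
    """Distance between task-loss gradients on the original graph and on
    the coarse graph. Minimizing it w.r.t. the coarse graph pushes the
    coarse graph to induce the same training signal as the original."""
    # Gradients on the original graph, treated as fixed targets.
    g_full = torch.autograd.grad(
        F.cross_entropy(model(x, adj), y), model.parameters())

    # Gradients on the coarse graph; create_graph=True lets the matching
    # loss backpropagate into the learnable coarse features/structure.
    g_coarse = torch.autograd.grad(
        F.cross_entropy(model(xc, adjc), yc),
        model.parameters(), create_graph=True)

    # Sum of per-parameter cosine distances, a common matching metric
    # in gradient-based dataset condensation.
    return sum(1.0 - F.cosine_similarity(gf.flatten(), gc.flatten(), dim=0)
               for gf, gc in zip(g_full, g_coarse))
```

In the usual dataset-condensation recipe, a loop of this kind alternates between sampling or re-initializing the GNN weights and taking gradient steps on the coarse graph's learnable quantities; whether LGCGM optimizes the partition matrix, the coarse features, or both is not specified in the abstract.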
Keywords
Graph Neural Networks, Graph Coarsening, Dataset Condensation