Revealing Decurve Flows for Generalized Graph Propagation

CoRR (2024)

Abstract
This study addresses the limitations of the traditional analysis of message passing, central to graph learning, by defining generalized propagation over directed and weighted graphs. The significance manifests in two ways. First, we propose Generalized Propagation Neural Networks (GPNNs), a framework that unifies most propagation-based graph neural networks. By generating directed, weighted propagation graphs through an adjacency function and a connectivity function, GPNNs offer enhanced insight into attention mechanisms across various graph models. We explore the trade-offs within this design space through empirical experiments and, via theoretical analysis, emphasize the crucial role of the adjacency function for model expressivity. Second, we propose the Continuous Unified Ricci Curvature (CURC), an extension of the celebrated Ollivier-Ricci curvature to directed and weighted graphs. Theoretically, we demonstrate that CURC possesses continuity and scale invariance, and we establish a lower bound connecting it to the Dirichlet isoperimetric constant, validating bottleneck analysis for GPNNs. We also include a preliminary exploration of learned propagation patterns in datasets, a first in the field. We observe an intriguing “decurve flow”: a reduction in curvature during training for models with learnable propagation, revealing the evolution of propagation over time and a deeper connection to the over-smoothing versus bottleneck trade-off.
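The abstract does not spell out the adjacency and connectivity functions concretely, so the following is a minimal, hypothetical PyTorch sketch of one GPNN-style propagation step: a connectivity function masks which directed edges exist, and an adjacency function (here, scaled dot-product attention, a common but assumed choice) weights them before a single message-passing step.

```python
import torch
import torch.nn as nn

class GeneralizedPropagationLayer(nn.Module):
    """One GPNN-style propagation step (illustrative sketch).

    The propagation graph is directed and weighted: a connectivity
    function decides which edges may carry messages (here, the input
    adjacency plus self-loops), and an adjacency function assigns each
    surviving edge a weight (here, scaled dot-product attention).
    Both choices are assumptions, not the paper's exact definitions.
    """

    def __init__(self, dim: int):
        super().__init__()
        self.q = nn.Linear(dim, dim)  # query projection for edge scores
        self.k = nn.Linear(dim, dim)  # key projection for edge scores
        self.scale = dim ** -0.5

    def forward(self, x: torch.Tensor, adj: torch.Tensor) -> torch.Tensor:
        # Connectivity function: boolean mask of allowed directed edges.
        mask = (adj + torch.eye(adj.size(0), device=adj.device)) > 0
        # Adjacency function: attention scores on the allowed edges.
        scores = (self.q(x) @ self.k(x).T) * self.scale
        scores = scores.masked_fill(~mask, float("-inf"))
        # Row-normalize into a directed, weighted propagation matrix.
        prop = torch.softmax(scores, dim=-1)
        return prop @ x  # one message-passing step on the learned graph
```

Because the attention scores are learnable, the propagation matrix changes over training, and the curvature of that evolving propagation graph is exactly the quantity the “decurve flow” observation tracks.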
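CURC's exact construction is likewise not given in the abstract. As a reference point, here is a hedged sketch of the classical Ollivier-Ricci curvature it extends, computed directly on a directed, weighted graph with networkx and the POT optimal-transport library; the lazy-walk parameter `alpha`, the weight-proportional neighbor measure, and the assumption that all needed directed paths exist are illustrative choices, not the paper's definitions.

```python
import networkx as nx
import numpy as np
import ot  # POT: Python Optimal Transport

def ollivier_ricci(G: nx.DiGraph, u, v, alpha: float = 0.5) -> float:
    """Ollivier-Ricci curvature of the directed edge (u, v).

    kappa(u, v) = 1 - W1(mu_u, mu_v) / d(u, v), where mu_x puts mass
    alpha on x and spreads the rest over x's out-neighbors in
    proportion to edge weight, and W1 is the 1-Wasserstein distance
    under directed shortest-path ground distance. CURC refines this
    with continuity and scale invariance, which this sketch omits.
    """
    def measure(x):
        nbrs = list(G.successors(x))  # assumes x has out-neighbors
        w = np.array([G[x][n].get("weight", 1.0) for n in nbrs])
        support = [x] + nbrs
        mass = np.concatenate(([alpha], (1 - alpha) * w / w.sum()))
        return support, mass

    su, mu = measure(u)
    sv, mv = measure(v)
    # Ground-distance matrix from directed shortest-path distances.
    dist = np.array([[nx.shortest_path_length(G, a, b, weight="weight")
                      for b in sv] for a in su], dtype=float)
    w1 = ot.emd2(mu, mv, dist)  # exact W1 via linear programming
    return 1.0 - w1 / nx.shortest_path_length(G, u, v, weight="weight")
```

Re-evaluating such an edge curvature on the learned propagation graph after each training epoch would be one small-scale way to visualize the curvature reduction the authors call “decurve flow”.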