Scalable and Effective Temporal Graph Representation Learning With Hyperbolic Geometry

IEEE Transactions on Neural Networks and Learning Systems (2024)

Abstract
Real-life graphs often exhibit intricate dynamics that evolve continuously over time. To effectively represent continuous-time dynamic graphs (CTDGs), various temporal graph neural networks (TGNNs) have been developed to model their dynamics and topological structures in Euclidean space. Despite their notable achievements, the performance of Euclidean-based TGNNs is bounded by the representation capabilities of Euclidean geometry, particularly for complex graphs with hierarchical and power-law structures. This is because Euclidean space does not have enough room (its volume grows only polynomially with respect to radius) to embed hierarchical structures that expand exponentially, which results in high-distortion embeddings and suboptimal temporal graph representations. To overcome these limitations and enhance the representation capabilities of TGNNs, in this article, we propose a scalable and effective TGNN with hyperbolic geometry for CTDG representation (called STGNh). It captures evolving behaviors and stores hierarchical structures simultaneously by integrating a memory-based module and a structure-based module into a unified framework, which can scale to billion-scale graphs. Concretely, a simple hyperbolic update gate (HuG) is designed as the memory-based module to store temporal dynamics efficiently; for the structure-based module, we propose an effective hyperbolic temporal Transformer (HyT) model to capture complex graph structures and generate up-to-date node embeddings. Extensive experimental results on a variety of medium-scale and billion-scale graphs demonstrate the superiority of the proposed STGNh for CTDG representation, as it significantly outperforms baselines in various downstream tasks.
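The abstract's central geometric argument, that hyperbolic space has exponentially more "room" than Euclidean space near the boundary, can be illustrated with the standard Poincaré ball distance formula. This is a generic sketch of hyperbolic geometry, not the STGNh implementation (the paper's HuG and HyT modules are not detailed in the abstract); the function name and example points are illustrative only.

```python
import math

def poincare_distance(u, v):
    """Geodesic distance in the Poincare ball model:
    d(u, v) = arccosh(1 + 2*||u - v||^2 / ((1 - ||u||^2) * (1 - ||v||^2))).
    Points must lie strictly inside the unit ball (||x|| < 1)."""
    sq_norm = lambda x: sum(c * c for c in x)
    diff = [a - b for a, b in zip(u, v)]
    numerator = 2.0 * sq_norm(diff)
    denominator = (1.0 - sq_norm(u)) * (1.0 - sq_norm(v))
    return math.acosh(1.0 + numerator / denominator)

# As a point approaches the boundary of the unit ball, its hyperbolic
# distance from the origin diverges even though the Euclidean distance
# stays below 1 -- this is the "room" that suits exponentially
# branching hierarchies.
origin = [0.0, 0.0]
near_boundary = [0.99, 0.0]
print(poincare_distance(origin, near_boundary))  # far larger than 0.99
```

Hyperbolic GNNs exploit this property by placing hierarchy roots near the center and leaves near the boundary, so tree-like structures embed with low distortion.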
Keywords
Graph neural networks, hyperbolic geometry, large-scale graph processing, temporal graph