A Gated MLP Architecture for Learning Topological Dependencies in Spatio-Temporal Graphs
CoRR (2024)
Abstract
Graph Neural Networks (GNNs) and Transformers have been increasingly adopted
to learn complex vector representations of spatio-temporal graphs,
capturing the intricate spatio-temporal dependencies crucial for applications
such as traffic forecasting. Although many existing methods use multi-head
attention mechanisms and message-passing neural networks (MPNNs) to capture
both spatial and temporal relations, these approaches encode the two
independently and reflect the graph's topological characteristics only in a
limited manner. In this work, we introduce Cycle to Mixer (Cy2Mixer), a novel
spatio-temporal GNN that combines non-trivial topological invariants of
spatio-temporal graphs with gated multi-layer perceptrons (gMLP).
Cy2Mixer comprises three MLP-based blocks: a message-passing block that
encapsulates spatial information, a cycle message-passing block that enriches
topological information through cyclic subgraphs, and a temporal block that
captures temporal properties. We support the effectiveness of Cy2Mixer with
mathematical evidence showing that the cycle message-passing block provides
the model with information distinct from that of the message-passing block.
Furthermore, empirical evaluations substantiate the efficacy of Cy2Mixer,
demonstrating state-of-the-art performance across various traffic benchmark
datasets.
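The three-block design described above can be illustrated with a minimal numpy sketch. This is not the authors' implementation: the gating block follows the generic gMLP pattern (channel expansion, split into value and gate branches, token mixing, elementwise gating, residual), and the spatial adjacency `A`, cycle-mixing matrix `C`, and causal temporal operator `T` are all hypothetical stand-ins for the structures the paper derives from the graph and its cyclic subgraphs.

```python
import numpy as np

rng = np.random.default_rng(0)

def gmlp_block(x, mix, d_ffn):
    """Generic gated-MLP block (a sketch, not the paper's exact layer):
    channel expansion, split into gate/value branches, token mixing via
    the (n x n) matrix `mix`, elementwise gating, projection, residual.
    Weights are drawn ad hoc here purely for illustration."""
    n, d = x.shape
    W1 = rng.normal(scale=0.02, size=(d, 2 * d_ffn))
    W2 = rng.normal(scale=0.02, size=(d_ffn, d))
    z = np.maximum(x @ W1, 0.0)        # activation (ReLU stand-in for GELU)
    u, v = np.split(z, 2, axis=-1)     # gate branch u, value branch v
    v = mix @ v                        # token mixing along the graph/time axis
    return x + (u * v) @ W2            # gated output plus residual connection

# Toy setting: 5 nodes, 8 features per node.
n, d = 5, 8
x = rng.normal(size=(n, d))

# Hypothetical mixing structures for the three blocks:
A = (rng.random((n, n)) < 0.4).astype(float)   # spatial adjacency
C = (rng.random((n, n)) < 0.2).astype(float)   # cycle-subgraph incidence
T = np.tril(np.ones((n, n))) / np.arange(1, n + 1)[:, None]  # causal temporal mean

# Cy2Mixer-style layer: combine the spatial, cycle, and temporal block outputs.
out = gmlp_block(x, A, 16) + gmlp_block(x, C, 16) + gmlp_block(x, T, 16)
print(out.shape)  # (5, 8)
```

Because each block is residual and preserves the input shape, the three outputs can be summed directly; in the actual model the blocks operate on a full node-by-time tensor rather than this flattened toy example.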