Learning Graph Attentions via Replicator Dynamics.

IEEE Transactions on Pattern Analysis and Machine Intelligence (2024)

Abstract
Graph Attention (GA), which aims to learn attention coefficients for graph edges, has achieved impressive performance in GNNs on many graph learning tasks. However, existing GAs are usually learned from edge (or connected-node) features, which fail to fully capture the rich structural information of edges. Some recent research attempts to incorporate structural information into GA learning, but how to fully exploit it remains a challenging problem. To address this challenge, we propose to leverage a new Replicator Dynamics model for graph attention learning, termed Graph Replicator Attention (GRA). The core of GRA is our derivation of a replicator-dynamics-based sparse attention diffusion, which explicitly learns context-aware and sparsity-preserving graph attentions in a simple self-supervised way. Moreover, GRA can be explained theoretically via an energy minimization model, which provides further theoretical justification for the proposed method. Experiments on several graph learning tasks over ten benchmark datasets demonstrate the effectiveness and advantages of the proposed GRA method.
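For intuition only, the sketch below applies a generic discrete-time replicator update to per-node attention distributions over neighbors, using a fixed edge-affinity "fitness" (a simplified, frequency-independent variant). The function name `replicator_attention` and the similarity used for affinities are assumptions made for illustration; this is not the paper's actual GRA derivation, self-supervised objective, or diffusion scheme.

```python
# Illustrative sketch, not the paper's GRA method: a generic replicator-dynamics
# update applied to per-node attention distributions over graph neighbors.
import numpy as np

def replicator_attention(affinity: np.ndarray, adj: np.ndarray,
                         n_steps: int = 10, eps: float = 1e-12) -> np.ndarray:
    """Iterate x_ij <- x_ij * f_ij / sum_k (x_ik * f_ik) for each node i,
    where f_ij is a fixed edge affinity (the 'fitness' of neighbor j for node i).

    affinity : (N, N) nonnegative edge affinities (e.g. feature similarities).
    adj      : (N, N) binary adjacency mask restricting attention to real edges.
    Returns  : (N, N) row-stochastic attention matrix supported on the graph edges.
    """
    # Start from uniform attention over each node's neighbors.
    attn = adj / np.maximum(adj.sum(axis=1, keepdims=True), eps)
    fitness = affinity * adj  # restrict affinities to existing edges
    for _ in range(n_steps):
        # Replicator update: neighbors with higher fitness grow proportionally.
        attn = attn * fitness
        attn = attn / np.maximum(attn.sum(axis=1, keepdims=True), eps)
        # Sparsity is preserved: attention entries that are zero stay zero.
    return attn

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    N = 5
    adj = (rng.random((N, N)) < 0.6).astype(float)
    np.fill_diagonal(adj, 1.0)            # include self-loops
    feats = rng.random((N, 4))
    affinity = np.exp(feats @ feats.T)    # simple similarity-based affinities
    attn = replicator_attention(affinity, adj)
    print(np.round(attn, 3))              # rows sum to 1 on the edge support
```

In this simplified form the update progressively concentrates each node's attention on its highest-affinity neighbors while never re-activating masked edges, which conveys the "sparse, context-aware diffusion" intuition described in the abstract.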
Keywords
Graph Attention Network, Replicator Dynamics, Graph Diffusion, Graph Neural Networks