DropAGG: Robust Graph Neural Networks via Drop Aggregation

Bo Jiang, Yong Chen, Beibei Wang, Haiyun Xu, Bin Luo

Neural Networks: The Official Journal of the International Neural Network Society (2023)

Abstract
Robust learning on graph data is an active research problem in the data mining field. Graph Neural Networks (GNNs) have gained great attention in graph data representation and learning tasks. The core of GNNs is the message propagation mechanism across each node's neighbors in the layer-wise propagation of GNNs. Existing GNNs generally adopt a deterministic message propagation mechanism, which may (1) perform non-robustly w.r.t. structural noise and adversarial attacks and (2) lead to the over-smoothing issue. To alleviate these issues, this work rethinks dropout techniques in GNNs and proposes a novel random message propagation mechanism, named Drop Aggregation (DropAGG), for GNN learning. The core of DropAGG is to randomly select a certain rate of nodes to participate in information aggregation. The proposed DropAGG is a general scheme which can incorporate any specific GNN model to enhance its robustness and mitigate the over-smoothing issue. Using DropAGG, we then design a novel Graph Random Aggregation Network (GRANet) for robust graph data learning. Extensive experiments on several benchmark datasets demonstrate the robustness of GRANet and the effectiveness of DropAGG in mitigating the over-smoothing issue.
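The abstract describes DropAGG as randomly selecting a fraction of nodes to participate in information aggregation at each propagation step. The following is a minimal NumPy sketch of that idea for a single mean-aggregation layer; the function name `drop_agg`, the `keep_rate` parameter, and the choice of mean aggregation are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

def drop_agg(adj, features, keep_rate=0.7, rng=None):
    """One random-aggregation step in the spirit of DropAGG (a sketch,
    not the paper's exact algorithm): only a randomly kept subset of
    nodes contributes messages to its neighbors this step."""
    rng = np.random.default_rng() if rng is None else rng
    n = adj.shape[0]
    # Randomly select which nodes are allowed to send messages.
    keep = rng.random(n) < keep_rate
    # Zero out columns of dropped senders in the adjacency matrix.
    masked_adj = adj * keep[np.newaxis, :]
    # Mean-aggregate features over the remaining neighbors.
    deg = masked_adj.sum(axis=1, keepdims=True)
    deg[deg == 0] = 1.0  # isolated-after-drop nodes receive zeros
    return masked_adj @ features / deg
```

With `keep_rate=1.0` this reduces to ordinary mean aggregation; at training time a smaller rate randomizes each node's neighborhood, which is the source of the robustness and anti-over-smoothing effect the abstract claims.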
Keywords
Drop aggregation, Graph neural networks, Graph random aggregation network, Robust data learning